From 134b06b95bedb4b7cfac1b17d2983021cb2d04d8 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Thu, 3 Aug 2023 11:42:14 +0200 Subject: [PATCH 001/446] update infrahub favicons --- frontend/public/favicon.ico | Bin 3870 -> 7440 bytes frontend/public/logo192.png | Bin 5347 -> 30044 bytes frontend/public/logo512.png | Bin 9664 -> 89921 bytes 3 files changed, 0 insertions(+), 0 deletions(-) diff --git a/frontend/public/favicon.ico b/frontend/public/favicon.ico index a11777cc471a4344702741ab1c8a588998b1311a..e8f21f7bb12e47f7e3947621fac8e9d000107ac6 100644 GIT binary patch literal 7440 zcmZ`;Wmr{Pw_c<)DAL{KX4A20>5}e_P4@;tq`Q$$N$F5Tx}+75MjGkvPPuT-x!*bW zJl~z`nQM&sj`5CIbN*S+3|D?7jfqB#1^@u2tc--}gA{$7D98`*Qfudh2Z3lQrYHsg zRk7%|CP)u!N;4T%MF8-m2LQhy0JwgD{5Ao=jST>{jR63X3IGI7nT;xf4~8J9wycGs zBEbBhqX0;7&j5r64esFq;D~|8Xb&172S@TBT@{Y;FCTaS2(<sSG zp4G(M(G<$+Vej ztu6k}_NE_E(aBV*l`e z><^-B?E$sZmaw*mI=~*g;d$)lzgqsy6nC_9bWw9MF@p-T|3&!+^l$G!^631(dH!Mf zizx_x-0y$v|DT8J5A@-538M*u|8r7>(Y!T*I{+Xemz5Ax_kf$%L-o*@sQK-pmNnutt2!?{7g|5m4GNXu@E+K zc+|9wj5+5v-#vjV-Fw5ag$>fh-WUDJSLWuo9ls`^2jM5C z369wl-EJ|vyMS?8?9HuoF`-Y8uq(m$=lkET&+xMtyRaQH7Ff=iGE_O=IB4{JIvy_k zagX_Y5*N`Z(|Rp%E~Q}4S89vL{M9?Xc&cf&YsGY}Fufus%4oe18%~>$zD%|QKICYM zk@3M5qKoc>w&PA4zo^2*vmJZF_<1VUuKToapN5&5Cdm+yrMd0idO3@l<;zB=-h9IA zV<>jK=hH8*yxEa7>onb+Qb%V`nUZVr`F)de_oTq;@XHx1&3@{8j@K5}oAw;r26nz4 z+tWRH^4)zJgYSp5E(Qsx+TTa;_`a#qt=PW%k*z(lqseLNp!wZwqj;5_^Zv=5loa0< z(W2PwM6c3@jN1k2nW0Yg`q@yHR%Gt9=|*jDa+@)T2~_;2 zpOu54P-V8Mp2Y97UrwJdzkq`oX2gW-A-#}BDBI&Z$o?bnYhO)v56?@UIw+Ww4=M%GrOAle7~9 zFK!h(TWepvy&q!1Kr4;dN{CPD(!XDKzTC4bjLG72Q=Yy9P2FbDmQl_ zY8+pz_!AT2C&`?4zkChM`dl}*ETT+5f1x~mRTN6(YfQJ3__gS7Oelk?sR8ln*O*Aq z9;2nNg)i-~q+AM8U`ikj?S&ILy#{;;`Qcnf+~9LhT4vE?0Y^` z!GZL?I3+YH8wPH`jt|Gbb5M3sY{9kiL2L5;LA&EgnyYz{+1Tg2&y*7IeC77Tka+2H z>?Oi7q?~1$H1AhHCkF%~d!e}qUhx5UETsd<-NAalJT-{J@2QkHC_i}Y74QQds0-ew z16iN|FD(*0Wb!*|?7Ao!c99y7Tq92#$GX7J(|c21o1{IfnWn9l((Y)E433uk$6~wF z6T>~W9dCFk0)?|GIP)=QTkon7F}8?)2q@Y)+}qUI_l)A5-zp8=^hWRd8!rZSp1HMk 
z)nXD%DZe|EcG%}e7?IEs@m_mh+%)H0;x*I7--dq5eQT{H)&O?}{h1Uy2TjcqT3`@- zF|G)1D<@X+b;cNazg`^rglcdS=mFTK#OV<@6e@cvsc(wi$H11g=FKAUI;#JUed{=2I2_7I0UUG{eSK!s)dM)`#|1wrJmOXso{{IY&Ar zpCs^keUgk28tD_vnVmUqPis!D12*urw9b27K+kcEC2bitlz#41H2ki>Xg!6{sdeD= zg5lsKJCS>a;YGQ%`6~h~)9!P@W{*bFI9C@k#;+zZ=Bd>Llf{vrFa@_{NZxs}yHnZG z{Lr^w)gCe^ZNtXWwW6(|f3Yi$s&4BqO`xHoW9{KR-)0x-bCrmhCWOF|uS$=-Lt}}P=8J-j7JtD9!jEB>&-x4O6OGlcW+91v7s8!O z9KdJcA?J(y-ZW&TH-v&M!M=A3uOS>D6c*hSQTzr`x(ahbaA-iFIeYV<{~W7mGgXjJ zH!`-E4yB4K8>RMsrUa8>fiQN~(>*vg_MDoy+WQ8(^oGJWmwjUo9EjU97xo#lvazN) z3%}Y7lNw@!3cCcmK!MDT&^ex5GX>>uzlkqh8TmXbm!f;|uaxS&KNvYFkFD-Ys34IS zRkUd;BLyKjb$P}0subONszZb96GwA)Q7jpsi@oX#`s?A!b z53N(pEY7zC9@@-P=?GuCPZVA>mZxH<8N9MKVb!_ujCe=IvFfXYt|_tq2hd%j%gQ)0 zsKj0;o3yj#V86W~o|$cFy!=8=?$nxOezfRQE!>EG02xQEJxigB&dAc<=dwZc*A#&P zAHgp)%W~uL32+u;M;nn(z21Ubl3rTu;CLyzokQniGuW(+Z*446bebO+55$I#$r zc9$499fY-pIn1IJByms0yOV=n2H-qvVtEP1uxu8z z{rIa=;BpGDNp-pCi(;R+CWHm?a;x4aq5~ZcP84EB!$r6gH=`XISmVZLWYV=aFJQb? z>{CxwMZ&K3O~{8F%6(Q(B}}Y>C@$wL$=%AY74{m`G13BiTb?dLJcjkY>p*xa`n_eu zaH&0^XK=`}j1yl6O3=K6IAG2Fxh)c3z-#O6%m`OGHCH5`v`A>{_DR`ZC7NI7=Vl7` zE|cJ~QKMj}(?_=Kf>BUP=)Pa@0wf0|rQerf+koLLDc=vfFjdL7kNia;ze zsyLincZ0NYy?5|E^{Rg3ayitQ%nW19C61EKdiE^t+263;-3b+&hwc63fe+3FEl!qV zCaRIZe-DgqKFr}IsKa99LaUV55;Vl~V-d+jW~jdBOCOPowp#Rl8A0 zk@4^1b}J(Ch}*npyTYExvg|6_5N|}R)%WcbhO=Q$iG)T(x}VBnlZ!-NxcdhB%Q|n) zvXG~2J`tY3!Sw8P>-n;D=Pd6@Z8&p^^t=9s&va8s&er;155h4JBt9Z1DG!S2Oww-M zuCYM)+DaNs71@cU--u$X!@j(Qf^AURZB}>Q{;?hJsIvf z1Vn{=;N5`D?`Ri5F;Qb*hBBAp&C(G-w#3cTDAf4cE>aCilJD)(v+Dv{)9K6?>Jydi zu!f?MrfAGJMKN&jFK=Fb5fiDANC?3D~Z-aeDhx_>gbYBa7}1-myX_wc6aAcVsoA`bmEKptBBy5eXzBjfO@@{TPH z#K`!f#|(UrD}*nIO4^WlmgDMMJ{FtP2+#M4q~7Dzi`?k~BKpM-gNR5) z$G~@FB#a)`t=)UsOA`&V z%8>=I=plO0y*9ceRwa%asd|gz*X0ElEE(>bL#e+Z?dc*wTFPrWUmIy>A--$LPnbCU zxZ5bq!FZvJk|>;tlNPyPF2_kPplbqc`85N8VWx{z-htW~SA>iLz$Yu1Fr`xG zhm2AeFH@@{A$9giL}0VMx1e{wWyG+bWpCd?L>MyweZhtUkY10BO9zyD16yb^)M7L{ zx(KT;Px{<|uULqHq`%}HUn3SSjw({~zWGZ5ag*c==$BMb-f4Vg*(Wy7*R0_Jv&`_h 
z9rb-`8nSezR`!$rJ7FNEOj;=n*sI=cAmAz#WoHyXx0m$#8fyU${|TUMWCW<=5M zMo(vIi%Rd5Jfz3KbW@g;MD#Phd`hczX-;zCfr=A_g|doH|Fm|KdRIe(%{9i9B|)Iq zeV9Hh4{m1bs8-|#-;xEWM?JpgPF}b5V`B-6yhvgq5UO3J>Fe_tSl0dWf@Q^qEhfK! z?iqd{-kO5eS{4{-9AU`P{3WfV3mg;j!dUIWMr`eGeVHm8uEd z>R?erJWCdn3rIwUMRgV)Ac4NQgh`4@veP<9+qFxYR{pfkD*g$~J!j5J@}gKK$OSGx z1}tpe$lRgFS5=ixjDj&{m^I0*I~>pQbv}P)(Nu-3bs1d5@SM|~_GIo22@bag36g*>0z z{%n=Um7|>}zTDRL>#)T&|1eBE>_t+K1x(3eK6SV8%VWM)qZ`6cqN1-t;Ia2WXJK}dgB|N|HOS6#b0J;ffxi&QuDo!SgHvn_)982Ut$wkQOth# zK?AspPR%r4O#{UvxAt5%^5nxtM0W!+zbNpRTRh|#RTH+E8KU3-S~9V zOP|-K^W?f1prhJ%PQ7GmPP4_%m&g5ko8%O+mdW=>voIGnyD2u(sq9Fb!XHP?3;EL6 ziHXT8U;Xb=IiP62Z1#F7?ho1W^7D^I) zDz2l*D_tAXG@ZzZDTRjBYQ!5@MR?LhFNfX zX5`BCuksd>!A3C+unp7W)$fE5S4Pc`hS6;6h_LkmF$oF8h{(t)n+fLDBg4Y2o-T$C zFeOIsu{D=KI z!A>G!H?xg>PZR4})Wi3f3D_1^Zun%4TuOr~>}<``TLi?+;3G0oe~Su#`DFrDeNjot z;9BS}JzZEu(E^n`u0(_0sa9ZH<2RtSkm-FwV*ul?k}4c;^Nc%>sZemFd-@L z5c_CxD}ElOgSGHWvqwIVc?)c0y;9DvTe8F7 zh??O^w_FFPNy=&#btz(mrAPX4b7Sbd7$nFrbmOFvqZ~7_o>i9-=TcZlf89z~IuKgY z7<59x3I9!AJ(TB@M&?WFH*}LyWcZ*!+5`FwKdZd+z4VoEMAr%9=9V#b?k_a7lZV}hJ-@T_hxshPd0{j z@-I;xmkcW2(B?SI6uP+D5X2mNqQ7?d716ZcK+vXh3v^h$R1IioBz^{Q~OoOh;jd%Ps!CbOinY2RiA zqPhldlHRM|O)qO?2N!_CCVJ`D=HC5HQ(=;xo^auj%w=nGD_Jnm#>4V&7Jp(HCAqu}S0_)G| z3#L1Eit104GcMW3lILA};RTQ_m?W^Ryv}~q)XqQ-hV;{>OPUT^^*H+wrRnJDw;PK; z6%i+V)=OoX5Sl&Tp6H){{XL#8f@(tk5Y30}Y@knzpWqc26|A|Bl($!={Qwre$P{>^ z3u89SZpR`{VYk{2c)z8#$~mQ;xtpXp1Hv5QZ^ zkBJa|e-csaI=7vl{Zih&l|tJ+s3&|haw;u7KeViMs(EP6lgaH47?0z|FH}O5am`^P zav5AZbqMrBHD*9f?pSnk(#KYZijtBu_<$?}kbeCz$b7QW;3FzZh!`F&?uw7Y)@Z@@ zXo0-p7y16ai4_*Z<>DgkICjkf%O54NO>Q$_{oAd3j7q(|d3Dw@k2?e$k=LH1n7A9O zw2GcSaP)912&6zgsRY6+gkgn3EB z5x&2Lf!o*2Z?u~1=0g;7^LIWjOAV>ENDbWrywQ8R>zcxe&B=&g??g{n3`J=z6z|=7 zx~4G+dD5aTYK7abxnWR^B!ZR9*|P3Pf+i#-Ma8w9yFwJIZPv+e5ewwrrz#q%s>_o3 zsT;nxCouJk%4^Q+4=S9@5X=n7PVu^l~RJ?y+uxZJ!Dc!e-$V3gN3Pv1{0s?DOIx>xSR@tsU}|uUrzRxCIVp 
zvV5+syMxsaY9&)H8*VxE-NGHN^?epSdJ|RnZpnpH+BC&J&J41s9)_a+W;!ynJkwX7zTVVK5jC@^rPi{d%k-$B0%56W_|lou=xSx?>3NDpg7pyEV}HE%HS{Y2chxP?BxJE^y=#mo=LncmoWv_~zSDT^9g%k`s zS_WLiz9;v0y|Unf)2QdSfr)K{gZDo_Rn3Wq7(Zrnk3;5wlqN*6KSqZV4#Q$Z$! ypN7zsd literal 3870 zcma);c{J4h9>;%nil|2-o+rCuEF-(I%-F}ijC~o(k~HKAkr0)!FCj~d>`RtpD?8b; zXOC1OD!V*IsqUwzbMF1)-gEDD=A573Z-&G7^LoAC9|WO7Xc0Cx1g^Zu0u_SjAPB3vGa^W|sj)80f#V0@M_CAZTIO(t--xg= z!sii`1giyH7EKL_+Wi0ab<)&E_0KD!3Rp2^HNB*K2@PHCs4PWSA32*-^7d{9nH2_E zmC{C*N*)(vEF1_aMamw2A{ZH5aIDqiabnFdJ|y0%aS|64E$`s2ccV~3lR!u<){eS` z#^Mx6o(iP1Ix%4dv`t@!&Za-K@mTm#vadc{0aWDV*_%EiGK7qMC_(`exc>-$Gb9~W!w_^{*pYRm~G zBN{nA;cm^w$VWg1O^^<6vY`1XCD|s_zv*g*5&V#wv&s#h$xlUilPe4U@I&UXZbL z0)%9Uj&@yd03n;!7do+bfixH^FeZ-Ema}s;DQX2gY+7g0s(9;`8GyvPY1*vxiF&|w z>!vA~GA<~JUqH}d;DfBSi^IT*#lrzXl$fNpq0_T1tA+`A$1?(gLb?e#0>UELvljtQ zK+*74m0jn&)5yk8mLBv;=@}c{t0ztT<v;Avck$S6D`Z)^c0(jiwKhQsn|LDRY&w(Fmi91I7H6S;b0XM{e zXp0~(T@k_r-!jkLwd1_Vre^v$G4|kh4}=Gi?$AaJ)3I+^m|Zyj#*?Kp@w(lQdJZf4 z#|IJW5z+S^e9@(6hW6N~{pj8|NO*>1)E=%?nNUAkmv~OY&ZV;m-%?pQ_11)hAr0oAwILrlsGawpxx4D43J&K=n+p3WLnlDsQ$b(9+4 z?mO^hmV^F8MV{4Lx>(Q=aHhQ1){0d*(e&s%G=i5rq3;t{JC zmgbn5Nkl)t@fPH$v;af26lyhH!k+#}_&aBK4baYPbZy$5aFx4}ka&qxl z$=Rh$W;U)>-=S-0=?7FH9dUAd2(q#4TCAHky!$^~;Dz^j|8_wuKc*YzfdAht@Q&ror?91Dm!N03=4=O!a)I*0q~p0g$Fm$pmr$ zb;wD;STDIi$@M%y1>p&_>%?UP($15gou_ue1u0!4(%81;qcIW8NyxFEvXpiJ|H4wz z*mFT(qVx1FKufG11hByuX%lPk4t#WZ{>8ka2efjY`~;AL6vWyQKpJun2nRiZYDij$ zP>4jQXPaP$UC$yIVgGa)jDV;F0l^n(V=HMRB5)20V7&r$jmk{UUIe zVjKroK}JAbD>B`2cwNQ&GDLx8{pg`7hbA~grk|W6LgiZ`8y`{Iq0i>t!3p2}MS6S+ zO_ruKyAElt)rdS>CtF7j{&6rP-#c=7evGMt7B6`7HG|-(WL`bDUAjyn+k$mx$CH;q2Dz4x;cPP$hW=`pFfLO)!jaCL@V2+F)So3}vg|%O*^T1j>C2lx zsURO-zIJC$^$g2byVbRIo^w>UxK}74^TqUiRR#7s_X$e)$6iYG1(PcW7un-va-S&u zHk9-6Zn&>T==A)lM^D~bk{&rFzCi35>UR!ZjQkdSiNX*-;l4z9j*7|q`TBl~Au`5& z+c)*8?#-tgUR$Zd%Q3bs96w6k7q@#tUn`5rj+r@_sAVVLqco|6O{ILX&U-&-cbVa3 zY?ngHR@%l{;`ri%H*0EhBWrGjv!LE4db?HEWb5mu*t@{kv|XwK8?npOshmzf=vZA@ zVSN9sL~!sn?r(AK)Q7Jk2(|M67Uy3I{eRy 
z_l&Y@A>;vjkWN5I2xvFFTLX0i+`{qz7C_@bo`ZUzDugfq4+>a3?1v%)O+YTd6@Ul7 zAfLfm=nhZ`)P~&v90$&UcF+yXm9sq!qCx3^9gzIcO|Y(js^Fj)Rvq>nQAHI92ap=P z10A4@prk+AGWCb`2)dQYFuR$|H6iDE8p}9a?#nV2}LBCoCf(Xi2@szia7#gY>b|l!-U`c}@ zLdhvQjc!BdLJvYvzzzngnw51yRYCqh4}$oRCy-z|v3Hc*d|?^Wj=l~18*E~*cR_kU z{XsxM1i{V*4GujHQ3DBpl2w4FgFR48Nma@HPgnyKoIEY-MqmMeY=I<%oG~l!f<+FN z1ZY^;10j4M4#HYXP zw5eJpA_y(>uLQ~OucgxDLuf}fVs272FaMxhn4xnDGIyLXnw>Xsd^J8XhcWIwIoQ9} z%FoSJTAGW(SRGwJwb=@pY7r$uQRK3Zd~XbxU)ts!4XsJrCycrWSI?e!IqwqIR8+Jh zlRjZ`UO1I!BtJR_2~7AbkbSm%XQqxEPkz6BTGWx8e}nQ=w7bZ|eVP4?*Tb!$(R)iC z9)&%bS*u(lXqzitAN)Oo=&Ytn>%Hzjc<5liuPi>zC_nw;Z0AE3Y$Jao_Q90R-gl~5 z_xAb2J%eArrC1CN4G$}-zVvCqF1;H;abAu6G*+PDHSYFx@Tdbfox*uEd3}BUyYY-l zTfEsOqsi#f9^FoLO;ChK<554qkri&Av~SIM*{fEYRE?vH7pTAOmu2pz3X?Wn*!ROX ztd54huAk&mFBemMooL33RV-*1f0Q3_(7hl$<#*|WF9P!;r;4_+X~k~uKEqdzZ$5Al zV63XN@)j$FN#cCD;ek1R#l zv%pGrhB~KWgoCj%GT?%{@@o(AJGt*PG#l3i>lhmb_twKH^EYvacVY-6bsCl5*^~L0 zonm@lk2UvvTKr2RS%}T>^~EYqdL1q4nD%0n&Xqr^cK^`J5W;lRRB^R-O8b&HENO||mo0xaD+S=I8RTlIfVgqN@SXDr2&-)we--K7w= zJVU8?Z+7k9dy;s;^gDkQa`0nz6N{T?(A&Iz)2!DEecLyRa&FI!id#5Z7B*O2=PsR0 zEvc|8{NS^)!d)MDX(97Xw}m&kEO@5jqRaDZ!+%`wYOI<23q|&js`&o4xvjP7D_xv@ z5hEwpsp{HezI9!~6O{~)lLR@oF7?J7i>1|5a~UuoN=q&6N}EJPV_GD`&M*v8Y`^2j zKII*d_@Fi$+i*YEW+Hbzn{iQk~yP z>7N{S4)r*!NwQ`(qcN#8SRQsNK6>{)X12nbF`*7#ecO7I)Q$uZsV+xS4E7aUn+U(K baj7?x%VD!5Cxk2YbYLNVeiXvvpMCWYo=by@ diff --git a/frontend/public/logo192.png b/frontend/public/logo192.png index fc44b0a3796c0e0a64c3d858ca038bd4570465d9..d4def83293bace5ed7b4dca1fef3bef569c3f781 100644 GIT binary patch literal 30044 zcmZ^}1ymi+vM;=Gcek+FxVw9BcMa|YcXxujLm;@jy9U=F!QI_mKmPZecklV`dT;jX z>6)tgbyaoO^mO;y;fnGSNbvaZ0000Wt$v+nt)z(RlCl~=7{J}XFbVL4#{pf(oa z)e!3Q8E7o2EC&F1QUCz{!2rPXC(Hj30B~Uf0FDg+0G>1e0LMPFO^NSQ5NxU`WhN&F zp!uX>0Z?G*0EkZt?DGcz!w3A++9w5&1|#@yS{aP$-#Xv`K&T}E^4~hzpY=bN$!GZo z^RF5r2kd`Z%mM#z?VcQn|DpeZ6%XnE{H);YCAFLY02IT23RsC?u_XWiK5nV1>8vRy z%VTV3!(eD)XJpFYZe#xs3c%;i^GVv6IvWDrZLDpbc-;9x|E0n6N&mxU1Ofj`#o3A< 
zq$#He6tQzO1#&PjF))Dy;DJCOpQDKxkFu!vzu})x{2&WwXL}w-MmIM%1~*m)J4bUy zW^Qh7MkW?U78d$X4SFXJTW3RedRr&be>L)d+7UH%GIq4Iceb>%1^&~np^=@7Gd~FQ zkD>p*{^h5$rP=>jvUU1*Sf2qh{v%;zW?*9czrjr1E&m^|eM&QZnQ(AZRf z`QMQLVEu3H|FWa?|FrWTl7A!l82=gX{}}&&CD(s(KeI~!o{#arGgScIM;-7600;r3 zM1@t|!A^By3{=IM1%4t2L!y7TfP;hUj0nqv6obT6#6-wh?1|~*qrG2O1WVosff7|h zE0nAUmqv?Q7eOM6lU_D^@Yz_++Ow_uc^YB@H!5`BRBA5&la z-OI+mpJGP=+~h#M{e#xAtX~E7k`~hcab!gJOC;UN&mqxb0YVf=x}n5JRc&IR(5_y` z;wUC_P*oYF`jI7V6B|;^Vq`yv;Nfo*L;TTfRg1-CMK9FhRbO}%B#he#B)`?W#Hv51Gq zt)%a}2pv9AUJ(Fq$TWrN#G!zbSe3=KhL6htj&`4~LW+u!MpkHlmdzl(uc-REdgVYu zX^{aH`@d(=3xhSEeq7z?KDtWdKSJ7XkY@Q!aTCv>XDdC*Z|u=OkY^_p~? zace7)k&&lhxPX;2Y*hacL8t&E{WCCO65%-5)-b<~_f{3k(mr{qpld_DUfebanzUEd zx&+iiXcOJqZT=ef_@cM!4oEJ6b#&ahIdxS@TLl*i)Ue4ZaVm$%ArR27hMY>XpvApQ z4cv~K?P0bu%v&1YB28`k?z3;OnL?3*-U9(uQ^HjP-#xIxyrXBFtvS#kD+S@#$p9rH z%A91H5j_v}k^cMG@Q@NUm-TD=anKU04Aek~+eaHRL zG$0pks)J5A85n2^)IxiK$g8MVK@fDxlysu==aN|AB_u?7pxw=BZ?N8y;pZW0B=8_^ z#dkb3TVAGzNS9+?xXBv)qr>7VEPwxcW16^oRdkSHyDQksfkN#dGU0rQt9LO9I5@;F ze;WN2+X2aiycPkNP!T0sg`4OS40``Ap;4q#L^JZ zuh2#X1`yXc-@t>uea(nMRe(KrMSg~&6?77Rgp3gH1Wq{D#fZcsI3;^4F&b@vJSpzlp?G7DWG(HG+$CA_X z4!!p#SvpRg?a({Q2RfHvFoa&it74xLX2IB&h(p0ZhK5ai*Zq#tu=fW{`E6s-4UpX6iUCG`KRt!%Rxt3;2Sh|(+7Zw7_5QqXtW93Vv({isvkbySp`dL z3+EP0zDS(=IVQbTO)p5w?u0PZd|2&$e_5M~PTlC%Gq&U`Y24L{Bu6j~0`hY(T|-ib zV-?LAv{x>gHAC`w*q`Gll+GX~(yB2?9ElUjzoj67rOH$B9{atQBP`hQT$T~gbG*=Q zKF1Hte_eb|UN=4jNC(VxE#1L`OJ{yjYiIR`--%aQ)sy=F}a>lH2U0wpX*f|V&l+1(?) zD@YnLLC<%X9pSHFX;2(6H|>9|7M6(JT|35q-WM01ZQduKRMKFuVOJCe#4AccpaKiQ z!l*Z&=2#QIBs=U_J4)lH34hyvqXwtK;M*R(gF);j!u=rRvbFiydlxldQ?gI{Wr8Yvv%cJ~K=~-fsVOf7|d`HTLyaVxrZ@cDbGDxJ&i7-6-6n zxtgki{e1)%!U#NLcr7uF8KG#Th|2Oau#^>)iKlJOy8$+g%3lg^Nr63|)}hdc5HJ>E zPXt*?7md6mz#*o712yQ7tL)^p5qAr#x=AS&>Ts-+4)i)l1U%=|lo@%J!(Yi;ySO`_ z*6EPmwVHe17ilqeS7LqZ8@g(;@KsDOv0Enza6P;Kq5~`qC@nShD)on+FWZi05&j;} z4ve_Njo8!tAKMyL8>f9+4_eku=1}#@5ho%GJxnS`jF7+fpSN;*uODV?X4;%8e6Dxg zMzzeH-9Q3~{-rON<1;`;!3_bM!3W2+Fb(>d&R@>_6>V()wLKBp;u+-$NC!}~$v0c+Uub&ybZby#3_J$KWdIJ((? 
z!k`!Qe{ypNd*T7b7|l^T>mK^_&(Y?d~}0Ai}~|yhssP2Th3UZ&(Y$?I%mr6 zu0ud15iO@NBmR}&z`%flp+BApBj}!B40pWE*IhAgu7_+sL%(?x?|^K1!`;2&_HbRT zcQ&fwmykGZytvLyXZvHn#VZW6RzXhApZ7aA!Qrs!YRv-E0%vO_ePko+%_QisgE-iR zF=)u(U~MMNjS#G&W}?iSC!OSv(%Jxw@z1Ouzhfj^N0MYE0Wuvr(-{LmOO=~bgUD6J zVdQr`cOL=?^w&2Gc!Ec+FYFq+H1TL?xcIoI&2&|E;`gLNs~MUonDAe>y+*-VS>5gv zb_QiNzcroJbYJd!UH5(y)YL{k%%Ky_w1O4=f(&jzvuhQwkEZoPZgiZ%|IBQ}v%TqEuWA6Hnq{j8aV;;ec}lFH=#1`LNq zNcmgODc$|$>i$UOS0V|Ho&g*J(g0Jhq8p!5ksw3qz+x zF{WxJhJ>^k{dN4Az0btdvuK#H-DC#lVNpU)K=u3IveFiTos{s$VbOA4%Oww%qh_S* z#b*c*du|a9*ux(L{cJo&k+yzg<@7`oA$4Fo43DDPEaXsVR=ewpKJG_V$ICLJ%-DMt zIiwRJ%nb>W4y%C*5D6TJ#1zxKB#3}e!R_4!!{g=L{x!&Ev4xa#rpALs(|o zF9a~IBY55yZ@yB1+vaQkvi-I360xld|B!|cFiC}#rzdCsHuoD;jxzd!abBnIa%51d zHtSa9M|D!t@$w?>VZG*S|As1sh#ZG`NbUKR7PD~Bpmy~+|A+LdB0l2zjB?=RBrZV` zsi7?ESiAE5qJGw8e8=7PK?=nij}#;<7Z&S`Hoiep-g1~sd;;`hNO)@yBRGApTjAtS zN5yC)+#SKXxo1Hkz~Zm84ow?r1zdZH5_aI5hx5hS9OJ3FwE||lQ$GAl7OVyVgi0Ck z#JnnTOoDQSxch~|a)Cmsw+wd}8#d;I8nrV=kIUsv?8I%1z1ud0U+A1;D|hpmLrQ`S z)CqxA3MK{io+0wa9dZ32f*)o3g&ZYwRvSHMf%OrD1WZ45^3^885qK<aGNNeKChR3+CA3wju?hf;I+a#HD67^pn z7)U{KZou^^4tl8NouKdrF5(7uQzSuqd_NZ!xIm*6X(vyk6I+)F@_9j79O{0k?ch3x zka_FUE%-ula=*7n8yTPYtoS7(#wHDyKf~38VEPlYr2kFA9Yu8(RyKPE;usc-t zwX;t`-2QVxBDa;>XwODdn1P7qiZ)3C@}w8!Mfhse33UyV??Jx^h9YStH+3Ae#-Ql% z4lY51C3f`9`nlpQARf4to zTyt5mh$Nros#?_LSpAkD{6%4#5H)ic9QNCFXy=DWwdB@dPzrLVwlu|fq1K}J&Z>L9W1(l53t>P3+ldlYC`tii+H=hF zEn_0|2Ryj_)^m^qGW1OMkL36XUEO$Gegqvgh+d1s=dtFC--oki+3%~mEbcu|DRHRQ zOkms3eRnD|^}Esf9K$EyX35%IOXeKx-|(r<-BCA6=(+f8sgSr%i`W(w0SP_t{wQrj zqgZPg1r%Xv!NZ%KY?AfeFIxxKpP5pNRA1RpsjG5txD(xkSvr0hWQ}AAz@blqZdrA; zzi+UiJKosty4s}xFld?ik1(KBN%rjP2U%Wx-;3yvI15Lvsz!^?Y-8XkbZXy9(fSxZ zL(%)*{g84()+EMy(JJsKBT-x3{#ncKB2MOgQQ84y(|d=qNwipq&fX@g)-C;L2*u4| z;a|5UBJV1MucO(W!WUU9T#SvR3fCnar2nC5)@`|ZQn!_=n{t29oKVsCbp+DkD4G+w zO?d{8k_SZXRfu?@sLep;3t7PV{Z zQX67EFyGnpN9bkT{5(G3ARFPgM9x&(`~4iE2cC}K%;fJGAV~E)=>;Zgz*0jgeAOL&DWf_rZsU;`wSbno`75nhd9g<9b z5R?-KC;U3DV~vl)N=2jNl(EM0Mb{CJ<#Ivocdw_mE@l_eV5;EVN3%VSR>u+7V}h9` 
z=<$fn=Y}m6Of`;$hhWhPuA=b*fn!Aoly4+Ro)UkPQoWi}&gltCP4=W9#95ksj`V3y z-T0o`=spbV3)jRd$zgulA{*-}!NLjeg%Pu_)~+xKg|PWc=*#qANMOtQ$omLL74}Mr z0nDn11uY&MnfSY)5fpR}p9yK}=;bxYR&lYSw20e4T3wc$)>rI%L+0mA`u4VV7E)=3 zcS;Kz9UnD(o$f*2ZJa6@LJKHjmR^LeKG>(C=s%FuK7O+_vRMq)f(ggjB`@%>C!b8p zCh*i;#|0&N-|>*SHM0CBA!QwV?=99J?Q9mo&rAoN9nYt{)nHEZjbA$p{h}I2hirsR zrh1;mMzwbud@U`Xdq|YUPg^Nm(Riz(sLEz?7FF^8AZkG%MEu~w&7{`d;O-U`-7Rvq zRpkap7^+$#*zGNrSROJ`X>%1b+V?)kmthZlLUp*_%GfR6`?hy%%KP`Qm0msQ9D0Fr z;3@H>y_|BR7J@q9IJXMTeKAP|SS%JY;yw;~u*UjL1-v(Aojv~XlbBWZT^kICeyT@o ztht=N5~eAhuLnp3i7g$uP}=~B#<6ZdKrv4;EhaPASE^7pMH$alat^s!N6qhh88$66 znOaNDkKW$U0uy}43EkJS1?0r6Pe;Q&WEe70HIZ%b8Ek1UoLVyP6AGNX2|4*0XZp78 zjMwbmDZ7zPUW?YSA9>0fprfzWGl3G2Lob#DO~4$NoG*6$H53=mM?FFG$HJvDzC9L= ze?+#m_2`H5GNvOVlDh6msJgJYEyBoPfF#t6CO?rB~4iKC*N8vL> z=}Cd0CK0OoJ$(myK@kxf^*agII9W@axJiWh3@m4IzfJ z?W8VN#t%b87 zAbO_fTBnHKg%AY_O@tie!;{aetQHxW=%pUHjGJ8JOb`@Jqf(?e7WXJjdfwxILi@O7 zX&zctonw*4sb=l?#?%5T9lLwBa+S<)+q%ai=BHiIb4e%X>qx1)WXk)6rWP*!YJnAy zX@|s767^=7+);Ct(+}nWV$xfbEA?mloB9nG31rQ+A^{ri6B(-kYK6W*6Tj9g1hUo_ zy8M=e3w1UDt}g1i1zfO;Sge-h?j^h*Nb4B!KOH99(WSdsNh&+rj09+_dhSHaS%%aM zW?@jUGr&b!H8c24qwuQYsYc#cGg&I0OW%L+KA>Lu@kgN}RVmn#gs1AQw6)|Ifc(>C zHOLN8UkP`fPpw@+t-kO&@dXn>vH5QbL z%fZmpy4lI4lRDWojySM@7$n&6HsE$NF2P*r-|x%ZQZvOitjVGnUG2BeS$H8I6&n_Q zw}o!I?Xfso=+Mr6I$=aJIj~lltIR;kVya+OW?=U+JtdpWH{x@Y=sO3&kchX+Ar$96rN`dp{ad zT3MnZZR?@T7UfhrP01(u5^0xYvY{pYKsB?hahO$qNvxJlC`~l~rmgo!t5i00Gw+90 zglxdqp2ehUz2yz~*d{ev9D0UEcC}UFX6$;}1a%Q{zQFJb2WPRbRVV`6b?$O@(v%1K zwa)dKKI#N7v?%Kg#cn5g%LeC!3dx+2uY%2Isa`ewR}WEqQo&W(LBFI$+~O&>@AdIu zsnEY7)O?o?0xuCmA)8CBJ!puhP-;~BDFtgPCC`HT?jWnCsA-XXwHC|b-A@aGYot5- z#*ySxgNe1(uBUV0PtGRAWrNt)6|tRX|C=v zd=?`}Vdq{HAT3vCVPRoHwxExsK_(`nvRk>Z6|ozUi;kwWLR(ecLO>NU9$qCW@MxU4 z+6*lN%E4$3kV!~N?CHzja_G=-<*JQ#g1)0hYC*zCsnRlUOaRL=!n#gWQvBM+;hrK0EXJD*rv)g=XKNfsN@oE@>wYdKmY38V1{I!iyE#FNvf9lbmuz zJ~$HF52J^2xy(QGYh35!VZv)~KAUp{pN>eff(W#vSUh}J?uj&2yR4ec3m!KQk`4+IdbCcH zy$Fj;@v3d966=IjQ_?R8`1`%l?_r0p)q1WTyBE}XA0C~=M&&FGF)_+y8UWRd9+>s= 
zxoo+#;^#0&^X_BK1Xylzf-~7k+xd*B@kd6|uO_XhBbQ(bhN=3nLgjqUVs&jyw_0qE!W@J$c!X_TM3R+& ze2Rv?-n$^M7PUx@;he~+8bPNi`k7jvZ)zI)t?yF=55w%6&(zLh;8odRwnK79nG2P! z;+7DQz8~ZUjcmh^-ZMZFyNH1zZ^@t(q~nt3&rRcDp6UGQljGGyniz6FG)bo4Ae{7G z9#m^>xr-aC6X7iy>_apR9au_jIsy@c_G?2?Nf}%ZcxQ}XlX{Ru7DQfg)yCwXy9wly z8b+K&Nmgx4jEH2wzH{M04+WjdJoy0JvyCikj8r* zUg!jt)HhA{7V0-9&>N164`p0O2T2e0XP5oxV!byx7(<#k9u(vY;hBY1& zvt#8WUNOLxYog>%dbsZ*vhC1z ziVjROj$g!44=OP@IIl{#)sb4H6}chI>$Y2XAkO62K%T^JZa~CAe`BFP^Bm|y zlgQomr{^JZ&E>;X^Oxhy)jaOvGQVmNa33gRqN>4dPQH=R)bNzFwfESZ%!;(ROW;|B z$?A?B!HK1btW>mxc_@}S{^k5~D-}EZ)*P!II*fLO$9ABILhj49P;9aoEqAoDUy)Kf z=4P0y5yv*f^XqK$nNGxM5q*UA0_|tsA=KBq$OHi0-@DkCQm)7Ha4VJDYu5^Ho+$YX zsEwRTLpa%fK2fk3&9z_2FvKnai;pp_@crfjT!C0PL)nuWg5~8wVUn7uaOM z?&Ck{LZsOY&%Rh9KS)bmYD-|aXeBJCDKMqv)hq1m7|}QD%_w}Ir%R}&n5fz1w4<7+ zA=`mXu|KH=t08*d{@d&4kb6aRfOY<8-fUFHN7#`NR8%27sjgJP?{FDbSaW~dbIv{R z=0YM2=npadwl^5>bB40+nNQXbrFM6jeDYH_-d=@=n@U4i*;#|9zTT&5T_2`I@rmNy?HrItYS)u;pE8 zZ^v@1fukB5m+_X&HjVSa^{@WE6|mXM_01x$64q_uK~(nJ&2hH*N`?;+I~iT^gT=-? 
z!$hdA;cI(DuHN3Jf?Prp)qy4v5DqA&3S?GQ0@m4&NHToxdd|E18u%P0zF5M$ld+fO zuX1|QA_4K}9BL=%IT6nz1;&zzQ*CPc=Ss;Xz&JsuzBLcH!^+kxYAsf8Zs==_H8d$I z8h8!XPftnjNjdUJF{?4OFyAER>h$L>_~78-8+yzt=7=Js-2woAX;5_Iz3YvB_WRa~ zt}pwDF8?L*?A;;eDy&{#SWVu$B03Q-mHt# zO8tg6+L393pk5MKFMkE9+cA65-JM8$V`xA0j(1rhBZLN4ktzq^xRht&c#R&nY-r$t zaedzhhk_U|d;%liJy0nxn7h}30>K1G^({ebkTv^u%7!p3Qz7-@eG7!J5v63paimBM zOmi@W1flVgp@jLdEo|1w?=FY}Vt%;Y;u8gTr&vttRYa`XQ%Rf$bbmxI+{f4AWKtj< z2lJwS;MzI@rkZMD_1KvkLiJwf)kf+r>t^=|qpUaz_9U6EiyrboI#_}2?JT6|htq_p z5Yg$d)*@Jt_&6<5!QVtz4B8K^=_*dH?;O)>4v8B_Q*4Y~8bs|ipw=EVRYRw8%ibp4 zD_sAou4zHqAtIk$mnjqLWB}ZOn^?Sr02*40f=%oWyv^s|sWuvxE8_EtRg!h6xpf`Y zt(7onp^-P^gvrS)P)+_3Fj8q=4oA=4zjJkth*c!M;5lbx{w?|eGKhvuI|}X{uF+xs z$#03LOn~7R|EL1C#ERTXIR$+eq|K%$@_rB1L^0rLabedQO(;g zGCt5IffLIhueW-xYW{8LXs0?mm}*JWDed+KMGMH|`?Tj1;SI#Z^c7pO4Ooq(3cHYc z?&+9(e_*NnZs1NLYi;P?_8I;@VM7QYoz2ouR#f8l;UdH0xgW)iDcYVVxe!~)_(K(_ zf-p07``H90{ilC%Nq&q}i^!+yoo!>PFvSfBeMgsp3^VTM&MI(Q!M>hmwA!oP<7Lt% zAhR*x=7!W%BzB_()3Te-4%G<>P`ivhQ>6IjTahVL5{HVCKWpJy#P-NPMZ@2vMPBX9 zL$&Z*Ohz};PyGv;s=ST&)Lt+uAYh-GN>o^)axtBzSWe%TP78WrBWSWntx(#`Nyj#f`fQ(%aKF%p-DvlS~@Nhk^v~0tUJ1NXIZKlY(_Q2Cw z_L<{Z?3n4$zQ}c7j{GH^hRUSx1ts26Q!O9!A)-`RX>~+CCUC*zuf?2=c6AIN&#iT9 ziArV%%ZJ^QH37xT=6eBqDvpqJ0`5{eL!;@IwS+B47}#A!IQz%t`etUcB013q%GlhD z0l{-A^CmhPURg@!)T)zcS7%=FoSJXMXbL0)J4uc%{?2vd%$HYrz784%`xbCwAK<@BrK?}6O0I#J=6V4FV`3ZF1rjiN7k z!Q5AhQ0>t|tbszwPH%knucsBpI{LM`5b2uwuGV0_+JGyk9wp8qvoXAA(B9G3fuv)x zy0pGI`k?CYv5@xmd~TF=`ZF!Xao0C$l8NJC1>=`7ZX%31NOTp{T59%Rt!33Z9^wQZ z&il$92CK!_Yo`2#eh*EP?onfHaX<^6Ge2ztt%OA#v)5X6@sGM+cv8xuNNR#@W+);i zYUC8E!iD0z4m0H8e1r)!m~n-{ip)=Mj}T)fQ&)D3VjR!49armDsoGbBuhM`nf84jy z=O2;<0i#xhP-$s@fy@XmNG13FV>Fog?{=%Dj*Y+cAt2)9BytWQo&>)95evy&5(Zh} znQ2m{LNnBMonG=0YV*52LARO=q62kmy*XZGTT3K(!>QuM%f;zBeRMF=nO#d>5zl?k zHM(Wmq5Q@|Q?uyTz76rHQ9+>q0(o38376IvKM=W_jXS(+>=@G8&pwuJK&A>Kv4+bE3_?WXrqM{eQ)`uO)4^Ex{L$=3aNn4Y)z}pw|(_} z;-U_`_^W`Q6lZIZ0%v#t@>CDoXhS)ye5dkoWp}*$v$IsfDu=`M#6|hu?bXKEL{`Q~ 
zkcg+0b-pJ}07I&1&mZ)i5OZ-oU{Kpx9L~Wj`G&Pja?YVPdTd*r})fuo;KIvZG5t>iuVCQab&jLpAKeYI(nlaLvVP4$Rnq9 z4CqF)bngq8fVcaRZTMQL6$?%!`(mKv99WHXjM=O=|MNdeImL{`0X{)c2u2tS| zV>EdD=7qO?i<{LN)$t6K8Ra{w;RT|fyB*Ge+RZNQQmzWmH=f45Y1)^O;dR};y$j#? zM;w7fMn(adg!U$f{c0x)<`Bc8rv$w$Q@i#Qma~Io7S~?<5*x4?{1i5Z>dmjbWfT>O z@D(@+ARVqH94J~1r2j@AL%_6lWuhXb1K1xlQb$6QH)kmAd#6!gqODW z;t@p^%9k)PJuq}+em~N)r;^Qfh_=zf1Lx#J$@4UF0YVjrM_F6lu+rD;9vjAIquOCG z#Jrsosn8Ppy(sl;a@o~u^oR6HYE*4SDMHu8wi@_GD?;+n=F`M3#)-HtqKx4&jobLi z@LHN^^-9ez;Z?$Gu&q1WCPxeGQWO(fGcz_fWJ_mNkBy zWMFfRZHhOryn@mY=DLd@0aQq}f8JuX)fD3;3p$t?ENy-YmP-Y>qp>>4I=>!sTLlFxR$D~QKOE!fbedGw+n@MKL&1*z zT%-((axG_sb;2=jpVF5x$>ASff!_oK%tXrn{f@PBu&J}LB$K1u8=QgglZSG0y?dXq zRHHq`*<;VDwcV;>m6vU)kgyWFwxnP`dhhQv&X_4b$C%2-2g8kPLBL3;3__0)@jgnJ zxqM?7++*P7_Np+O0cuPuL|@|NP0_Wa9(6AQmzp@YFFBwu#ZL`;QR$S-vC7;2ItV68 zie3OCC*x$N3*hh*>`EmUyJ91YBq;Yl&{@~-Xo*BgO#y`oizWBQ!xXz+W!7}NLVxec z(r=VVglDWks#&t!EaX>ryGy35?rrMo$Bp3i=7Zk|D{b>AnY%cyyBX!-SqMW_{0b3b zOf*JLNUok=#S)p_q!tD1d?2lH?jpZA<;dN^L!9f?gz9y}q}26Q=0@Sshg`}CPBw|g z{TWH)rj!U&Ah@!$OjSJ(`N%@0_Hc!u`iNTnd(!V#?gT|fLNk?iLh+Y5DW*0j3#>f0 z#dZv32uQ-U9!>cz2}=XEykoyNX@E6ye2H{m`bja^)vOBAq}Xn8;m~v|9WjhZUlB3Y zHEd^GM;!D=?T1>H(J;80vi1u_{A=(Q+OkPpE+JF_73b)4`z7LM)!PuZ2! 
zEFQn&f0_I1uGXfnvgQ{yHP#wP_VF_M0+K2k*QY_57*a;0)3t!!<^k*?iQ$h9C3Rl zaMZ+DqLQN^6I0S%pTS+9o7G5ltUM1HMOU2GvxgEL#T`FhHhSXJG#MHR;*Sjo4->S#hY zKgZ{^LvH`j%a0@EZ2*jyCdHktTuA|;_;E;>y|<~Yx9)il@aKbWa~6Tv6ODNK-#o+Y z2_YZO=jGcPqf=Y`Ug=RjT13}}qRw8pes7jKEsy=hxlLM33n?2P&(wK(PsMU%w|)$q zV49)&N)dHRFOt|U^ZU@aU+N^Tm2W;(c;V^xzS3mRiGFWrLtXj>M=()Bc+*YC^H+Ee z30fQsT(ZG&CMMfWnVeo-2)|47o4 zY98C4$1SQwMx4*NnbsgS-v^S9H(Tcy(bq937@Jk>;RiBU&H%vR2us=?jXE|us2j^6 z$ew);%DATe!k|s|-M5zSY98caAj^pNapf}ug7##Z&Ho+lo`n1hyt}qi`pjS6_sGLs z$VVyx58)$erAwbC)~{3a%f}t@)9y@tp@V7x-_;r8HOy+t@6sYZM?(!f_Dwp=zdF_Q z*F}6Bqa_3@EG=$sBY-O(fddUou6wzs-O+VNtqs<~5aSmbyf7=9pXOJ<;QO0RI-;BSZbIw|q`7}or{#Fu1% zX7UaN_`sc#4TIiF`%qdxNkPd;B7p}i9VdKPdLWT@KGUYn^rDaj>$N%YL?W^8lT6jK zQM}GvN0;^UJ&9D57$s?(5bo$#Z>-*{{O8zUe`nccsTDF5)F-~!_e%#@NSL~ zcKtWRUE!?SD9^d{%H6K9$;A6gBQ@l%lp)cGOGRwLDQPl_G27Np)a~jQ7T;wZyvKe~ zkMlC$nfyd!!hpv?9`BZ2>sj=mQ-F)VJJ7(QK~?Sd8w%VOY5|z}jBy4*85)6B_)N~L zbW3$cwH<#wv%XVV*<%I*YR@i{=q+6u1m|1O(FA5vgj>c*Jp|utJVB^p=5JaNr!=~( zAf;WIm=<1kxo1HErK$`*0$4n%2?JJ67;OD3dpc&$wi1fkUrDN&N%cK8ELBP3L&P!3 z105~JIkso`b+}%%lisHy9``|b#~sE<>Vt4Yn}Ybp{%?vwU-^2bpsFg06QDJ9+iXZX z9)&^nA-d7+nk805n&LmclAEfiFZED-K(lU+(Wl45L4qgs?Cr<+@G|}R(WRiG%dOL` z`rgA0BIGV$WwJvRI^L3fGXAZ11YJD{Q#~VqtR)AJq``Wag z?IP|%eP-$xe`Cff+VA)*8o>&6pLb9(D22!E=FV2gb1jhAlHfgZLzk6($#u82K zv%e@8H=O_%b-mcOF`jN}X`J6@)L3i>jkIdv(GjYCXXCHShzFKzXK*Nwq&8ZS>rZ`% z*k&|}>b-lGc63axr4${fdNPEyv1J34Kp|cZ!kF^I^EOOCmBu9G>v;2u1X`G>(y6%OsZyO8=Qx5Y? z4^HcBHHlGps}gNZHUo!cvMB_*lwSvR=85lL0Q8&|n2vYlu%q_At0r>H;FK4+tY{@9-5so z*wL2cz16>g4C3OW;Je@AlHAjz5LnMb1^-jy_)Wu?jI56N&)XYkCO?!F(nBrHYx%-Z9(2MuQY!A%;}=K zl|6p}zJk&;%(Y`oJP8`}&|`|EmXy0Jnm7qEp|9wn&+?+&DU9t&@RAGy<|Oj0r)^~c zdjkp+?=rvo+A>fiP?BKNh#0e2VCu~k!Fqn^x4(RCH4>`pXX`(u4xHHvNYd6-Vw8Ec z$TL_)Ey3`a;*pcTm=pKaY!MCdqS(m1p7SN|}Z@HS~PKbeWoUm^|yL;vT zbAvZ`IbShY(rnSbHZ5?cvn|p1(}D2fu2Gcjf*6cln)h6ULZ=g!LF~GB?*|T^DlkGH zt|y>xy%B+^3|bwZdW_`f^$M~D54k_ga9Q0SEmat4$gDts3`BE7zCH&2yn5AZ)z3?! 
zYjH&2yG`mtoR@?iox`X7b)au}p3YnuEL~nZ{5ix+)~v2TCbPzQcBGiQJ{^F(?{qPb zp+aoBjyUKW!})Iy5%{LIs;VdOU)DHiovJ86sBWy2FOG6rTa6-)HI_$w^r;CY1=N*$4hyp?z@ivSFA_kS-c16#apgA-BfWFZ$fGY#N7^A)jnj zX}O=l4gxi!!ER}NO!F7b^qw^E0?yQ)Qp_myO!c+ysEV0B7eWd)Mkdh}WB@y+VIBW- z0JzZFPJWi8f5)n3fT~^T_2XJ*JY1mnnUIONVe(?97t>+^NdcM;hrDjLN|D}k!K#<8 zUF*>`*i4xz0_m4ga^@uSDGdlC6M4082NVOX!Bj!IBwor_fQo!rzroOwI^m@psDRQG z)Lq{e)3)`uHs@rVgEnRyza1_(GP*S*P`+Hh)oY& zoa(2$oLD^AFn>((V^6QSi74LD;<(K9I_Xt}+I53(k?td`*dla;(RBC()4o1!tw1(qOHtzjYy? z>GoAAMcfmGo?1(ekC)7!QnRdymud1vb}ebgy)a&*G%-0o_aHCkAaVn)hf?uU6AYBE zEX5{DyR4I>vyThd$b#GiauL&vm9YIZ%c)9FtF>JM=xr(O@ov27o%idlTjqxj`#X6_ zooR=#i<0twGyXz}nLE)<|BXb!?=g!`yH4%6&(7D!x{(iXpv>o_VWV@EOpNNh`wpM- z1%Ho_7+4ibRZ+em?3Q}WTdnfMpSJ-g^t*61C1HRX94IkJ+>&>UTg^(P1b#D%E>y4J zgz4JW%6G%bc##g346E+Y{Tq~+rl@}}tkJ!_{y^6HVC3<-EVG-*+Ga!@T?_u#dP=|3 zH?ia9N!H`zrqCW;Re#yp67*8*YKe2}i)@eN5?c6(tBO7sapqS3P9lg7=vD)IH8)N_ zxhHVhVkEnl-cVh&u@Jw>11dV)xT5A&pp@Ing8+-XCgqS(E3$+ex?S8LUbr3vGJDsI zp+LUd0JLneqiFK>-0FpXk^0$;(QO~gI`_wV7SG*u$CogYnOd8KnH!b+(^Q47ccm|0 zC}wlejRo|oCBK3AzZiS?p_Ayyk{ls+;*_|8l!$lGF8I*_+tf#ZYqfzAVlrdBeZEXQ zuNDW41vnWLsZmcGTWPJUn4O=Wy%scI&I19bT8tJ(9-yQye%^IO6dwLPW_pag%pX=B zu7@k8$C-R=enVPs!oed}C>$_kUq?Pf6J2}SDnGTqh+8kd{$ZWo@%{%jyw9jb>7pkJ z{zR$V73r^s(a+kqgcXcCz(RcTX!R;_Y`hXi;kHMZ5N;yut+@|-bJoT9l&N|MT6|6( zO&mk_I7`e&zCN5k(80g(H3I{%h+-9H>UegYm_!hiB};bMFUJmc+0GrqXxbh=rjaQ* z_IbJmJ8Cgq69Z`^x3<%lU5ZxpExBU1zP;m7UJ4|-zxaD>6Fu?nPUMmntc9&+b&$kWAWqGJ0wqh`_Dc~g zU8V=x$eI#Hw^ey!0(wzHvVHk-%v+;y+F$v#MzKIQo7Bew2F*i!GRqTAeAUzP5tr)QBed>`7ctssRRiA&Usf+qA@nB(IeN-y(!=p2Ft{>2 z#aNkElpdHl|KLlWw$FE`WM!7&T%*E)E3gdFUfaM1Rjyqti8x$6Dl9s% z8-7Tla~8q&v&Q+EiDQs6n*H^$yor^uI>!w2>|BF{G<-v;I>EuY{b^k!DM4z~+stD; z^vM%X8p|tF?!HL#^SmJ>k~0~i3EcmX?LApma{Ov8ot$;Sl6&^Z0X0gLuo8VMQ>R=t zHECGO+}g8J-(d*c>x&dA-KC8{1?4) z++eJFPSRHD(l#~p9wiu7or$N63m4C~N$)*Q?Gs#YMM@&-C}-d2yMMY{2$UU8deZMb z;T8RJ{~*DE4HZw~EjK7RBOl`g_BFA#M9o{h%`O(CvBjfhTg zR3{>Pjrc5q?J(D>W?zjlg4w)~5HmgS^Is^a8;=@w3{his6E0i4M7LgIi<0^@z(52C 
zrU243+pM$M4fnqrGI}oLotE|7f`S{L`%t#1j_>&qh{0q==wW6BoaQ|b=g+D`%*Dn6 zWB>)Yn*9%4P|j){O~u$>jg(?%AF@2bna*~gQjJl)8J>>*LhY`alR?qM2-?zPtYo5@ zy;VRb8W}@l2Oxah#`w#ATiVU8mo%6u*bRV=+>&U+`DvfeyVcx79vs*)25y3fr@^F! z$DQ)Ls{u}>=9$j8=!qdgp~x;Iz0OMPtK!e-V0UgO20+d;%24iA{ldzYTD)`Y_BpCN ze(qk1loNjze3yj}Z~xqh7{2m-K^|^cH5{n%qAUP|9*Gu)<=Yt>!6l4l&|)l_zQ7gx zi}mF3)s_+r7|)&6W$qi3H|KW4-_6|8u>*l_Wu%7Y{W|^hls2Av_%{u(i0B@m$iv+d zhELCvVRnvbcb*<~m4?h@%BaIrt_%rE5-5itGsSDO{_qQ>Rx{7 z@66#Lujv0T{v-|}t`qYdd6%p&pNULPBzMF!U+fOzhl>_x zd)?)u)0F(i_vUvRd5?zynDf)+R`Rv_5`gM>5C8$lKQjST1PF>_LVci`>=Pzp{gp2B<&hhv^0~8&0VjG=Vh0pjiQokt zh>&U?z(d3IAlOU*$J2%i5J0J#={%079_6IoBN08WC0eD)`aTfT|V87dtUX zu7_~}oz^NJNhEFtM_tX50aY$r0-!yL(Ky9Ik{y0Ffy9ud6cH(h7NXh1-0 zsF>lt!cj>8 zH-Kb3epjl}an_t2%p!PI6Jc)3E3b$OBq=XTci;U^0N%HYG0|&^0C3eJ)mbfbfyjM)?hu~KNV4L`-;p0{NbjK(Oz?9}3%uv$bn5(;c zyi=XQL9n~PG*r2OK#a%oXyKvlomXX#$Ghr=_~xdtZmc)`@nFH7{N}ADzZ5j^F2SB zpG({`ABDV>0OoHhQ^!xH(cLVM2X4^+Jp(zbO6w_|5v|(%vaXu*^Ng=rm^kD3fYnz_ zz)em0UVl@~)NR#Qa~gN=Oy60)rXL?B!ph0jR6yn3`jr5DtF{9ix??m~DKitMDao-M z3D_x0{{;akWOmI64omPNRs!IW&Q7E$gfyU~h40M!O7r*1kg!TI%dniDh_@p01x0Q-Avth)iDsGUJ*d) zY@Bb15Q@X)AuAV9t0=`>dM!J}l3Pz2MTSYrAvl&rk^oE$>8d#Ms+1e!I3{N0)qdX! 
zl*#?X1=s*o3#ULft`{kXR{SQje}Xz!vPFm&gm*Tof0ae>8pg`-P=U%bN$GB?U`C#k zacJb%;C-7MXs0DWC0x}`D|B=r+@Ncu)5MUJm>?g2{0Us#FPAOr{*u@RbtD>xQd^Zi zT)BOrjwS+PBAZpsfM&pVOzG{5EScR$S9Ugy29@ z8@O~@HXP~#HVQ_S0Bfhk$p;n4sFdT^dqiz2eV8x{LT(}XG zMhUWP`QMV8n=9er(2WsD47IRU|C(#AmgMB#^5I9H$hapSQ{UQj>L~{gXGu2PlY|8E z%rEuY_aCMM!V%|+3UZ||a~~|OhDEA4KSx3%>PrBQA9Xps#`jVb)6B<;*O;hqOj87d znEzC{eE!8$8FbNw&Ni!;Yz!)tdJrn#p4HN*dtXV(%*9!K+>=~^!gUgjyxbF@2j2Qy zHM{y<0(jki-2voA`N@BW1(tC6b<9C@mY+RjvqmZm{lOmC(x{9#2Z{nxA=43J&iwke}3m zzv6;SIeKU-qyY!vBoe&W?e{W&Mx(R}$5~&AP3uX@mQ@P*pMCCm*|TSlTy^CY3i)+= zdDWh(XU?3ZN-4(K3+j+OC#=1Lt41Y*_bG**G2{m8zY-HZ&nVfh;SqTB7{beZe6>|#~)6X@4ufV zP3i>8#=wPI0K|m zo#*RKXEG&EreR0MHLNQKc4o+W5bdaIZ;-$L#`(nm_mDJc(nPh3ZDd=`$p6hZ-^t^T zJtA>!y34^M*~ka=@d2)`aUDdU`-@lQgut7YxQSzc#m$Gxd*xRGU|0JbN+G@j$i8%v zMl^V2$j-==nA*Woke(q)NoU9pKm34oWpFmjNiS0%l7!+^nv0+|tX}?$y!-BZvS;fm z*}icFq(PL_2@k{G8gwv%r+!k0QroPOp8>9aBn32#fi}6jV((^XIPQs@g;|n!Xqyz~ z8YWmhMo|Tsel+3}`I|&Xz~y2Xh;*rR?cQAG&HG8-e{Z66>eR_8+1Nf!0K7j4wdeKM z-;|+4FP0|popFd?w%Cv6h@GdxbKFCi`2u54agRp=nA%%aw|*Z3u&W*2?OmG`0RZV0 z;&_*&R4jHV*UP}tLyyX7IRz5aGDdRYZ}XyyE|TFRN6CHn+$mkUba6@nEft6~(b3T| zX3S{m)2ENTKly+1@=GtEh+t{k=``7&mWh*Bi*a-jqzDVfk49Q@{lbjBz$K~y4l*HP zc?#;AK_G=$`^1``jj=%k2@DQJ9tN>7DPxfM>P6I0QfU*U6T{@mA1H*bTeq%y)|{Sg zU?Ae>VNb!lpMH|due(bM5ATqKw%sKa$X~4M&ErmK5#!uP38tVj@#crcvAwKZ}Fh&p-WGx<99uy7zkDHRBvQx?FT-+*}BouyAjboQ`;isaY@aFMcoU%E7J)kTsz z^iY?-Oich9nOOgxJ$uR*Urdp&zWQ3SaR4=uuu)WPX;e2{@~~Qz z1KQ$zcy&-^0-+on*!#94rb=V5iy}ya+_V(2Sql{6AIdIJ#CO@{SIM7$;+UqV#z|~! 
ztU`9J<`D5&XSQM6wr%q8!;i`tJ$lGDU(bZ(>M4;JEbmOskRZUJRNyPtoybcfP=jJ` z;ckk@zJqkE&cW|0kk`P_%u*8oJ4A6&eKqIAcMO_ncnje3QGeoRQ_$H5gJY-TWS63h zT#0XUnjD3>^SbM-OPk}sxCle_P_ zQ{|qJ5HIP0b>zsABdX}*n?}>Mgqt8(p=`FJvT@B)iEff4AARz@oPYlLYMjD?HK*5C zBL86=Q1{i>Gvp?yZww^zZ1^4BvNv5qVbW#WkCmXJxc(64O0*f$(;#Ra#a%y`zN!A= zGy0*Tq0GuvG*j1fewP4jlLwr1t(kse2ROO9zSmb1w^4B%?}_{jUKE5BDue(T_NHe` zJ|@Z9bUH&;t=z2;?6%v-%G$MSo!nG&RuTm78#ZhxcilNw)~#D7x7~J|Y~8v|A%Ck@ zt*}W9JGYKA)T>tys#hzRGWW{9ef#C@x8IfxtA15u7Y@={2OU@^a{NLZOgDS>9639= zze4^NNnI6j?>?9*8M%cD`E|Q>+H$YXejtn;aCW(x-=j{70kLk70A3d663ykI+}c$* zg5L>%HTA7+jP35v+-CE4^Ly{_nFG}#>hxRp9+n2pB0vZ|<-K<&O2>{J<;9m?0RrxG zQHzLdL^}SpZQEAfeDh6Nv}lp^?%i9qY}uj;kByD>z7mK@&dosNCgP{1rK!B`z4u<( zv}uzfvikLLS_}}MfZAUS6RYFC7 z%&_J?gZZ7~nFOOg>u)Ne_df)K0SfQaR3N6`3BdPu`2h+hz|pKo^&mIjDm$Rxw@B;tZ-48r(e_n}NF%F9q{ z$-;#T6ms8v_uWb@AtLgRh^dj8_f4BNl}8?VM9s)vbImns=9$P()rN?!LsV3h;_g(j z$jM)N=_N%dmtTIlLVo6@1Br%2V1J|sWME=Og2#JkrDJlC8U{GHGsW_3M@FXS8~#KD}B6Pz4{7?$B!Q`ty{MiUOBT$#MA_!IW-ZM zGqRH>PsYG>wzO%}M&(hbP93%GC_Wxmqx5t|9HU2%Rs-()@4sIfHEQHh&*y;q)?06d zBsv#n&_7jO@^L2F&K=t^xwT6QveSV4#a=I`s5D?zc1C_c7QYgJAKFuCT|h);D9Amc zCZlYHSVQD^mzH)&Au-q3G;7vO?!5C(Wexi7yYH0VPvqBu`H+JLV`jB~|Ncs!fB*gW z<9W#Yt%(!PCrrQ2&x&nxp_o(B#bkXK)QRmEW*lq3<@`^&LH zA^ikl)X1fJ*4#sKG;OC?v60_s2ikJ7i32}LuVZpp_gEEPb(^X?AsAdgUkWl)#a5K- zfQTy+5wZ5}-K!SUk&6?NFTVI<%ne*B7=Cq;}Vhi$C=t2Z@f{?JMTRC-~axn z2E3PDc9}wC7DQq(r{g`BZc};;8#YYUB{nt|b9_Y#@%cjMuRaZ;)KrO%sZ!3k)%C_tNa9xl(6H8opr5PDRUr}(upsxa6lEVUss-dVxTkBaiOdW{ zSRy~omRw5x_19mkzlR@wSgybRdLeP>ctmEME-o%ko_XdO_09ZU$I(QvWXTeF_St6@ z0q|Xm7A@4W2d2@8uk&&Nk~YL zNs}fi9iPa{z;gJf2ewSlKaKc0&c=-!g|`1*y?QC+rv-@VskJ| zLZCDtG2C*?Ey9Vb#fulKj5N}gBnZA|dJ@uv2@}*{`HeT;P(;MpVD4SuN=}~x@ugU-N%P&6#0YPOfeLv?5C)~#wbm}ztzcj?Q1 z#|Gn%Wy3ney$N|4CxJBZ1EQReF+ka!RrOz;+Y@sRi&Y^$%2dcvxezD9a^j1DQlPA$ z34@X$HZ~T^vUe%9h|gMGDl5HL-<;9_D|9FgFg6I~*dXlK-e|=Gyvpp~5$V4}0K{z| zIq{t4!2-1@9ZP+6k1OP;{7%&qUYxk{HgNd_1JiP0Tjd4^krNQG3KeAR7He1?39Zdd z`N5Df)yPtWDpqwkAq3!uc9nvg!S%5{G5d(JldA{{2`ZQCmnyirzR&~@7|=>|6dUtt 
zDGCY0VSI`<0GAWdQGSNk2_XQ#p&UAyHhp%aaz&B>S0pxUlF>b@k-*PDuU5qo1`bA0 zn6XESLn9>wE?NRB!9|PL?X(+bs`?EtOcnDx0r<0#mViKb>99&6EIigchj-`yGS@eC zyvwa-1e6Al1bJBZq~;B3)CO(Pj7lPLNm2d-lQ>N*{_OSgt7eYjcLE?By@4ecp)qhE zN!KD?gOGq{TKrd=(y8mi_U$ZHcecv?)Q<}U6l@50c+Z227HeK@43e=aKge}czDpVp z59|Ufy=H`jvO_$?D(AJ=onB5pWqfwhXc=)%$frW?=S8Sps|Nh=U}2Y+wVVf4{y0*T`K@UvNv#dR<6#=tHNwaD333n>xfU=ebfPHX8V+pAlsic7(8xZk}QGZ_J8%N28 zH9KU#q{{1WyeZd?x<qtUOJ*Yc>qsm9+op&Z+ zhuTQBEtH7QP#))}YD-nMRM(Y_A}a6W{i$jKaIj;en#Kc6ZD?^=7_0*bL~^bF8nKT3 zYITT!1Wb@2-Y5^RY_AQV;?A zq<*+f!fIN9{C`Nl0fXd)XP;E=C7C1pmHk&%&~ZY+>3_N zF#%y=R0@sxe+MN~PppN#svEO%EkDeiZ=r}M@)Ho6Ta+x4h)qB;P7tQ*FCBV`S`vK6 z9e1exxHfx$T6xLz<#C#ldHegBZSdyL-}UmgJSThsYc9_|ka`H)Ft^G3@4qKsPya$8 zKl_1yWg$Odz`(K2p+jl%%@6ZL+Qmywp;dy>-_D@$#^U`N5@$m1Zw%9e_wl`rX~|l@ zzNlX7PM5=K&=l)S*kVO&Rsg3Vbj3;U&BVbBDbgk(PNK0XoWLbuoIoy765ng!a#vNq ze*NV1(@&QvQ$CSQoY>9|;kp_AdCH?imZu&o00DZgh>VO>Oo%(#=((bQ{{c!>W&aWJ zJ;7i1pC*oli+_`TH~pW?p1Vex#W#@j!Xo% zY6U9pAxV`;faK(H24Pt&3oJD6p)uz23jwI^=d_Xcmi#TP2Hz;tzn!HJS8;j=2QkOj z_XMnpgY=XfbQ!{B$Mnv?_44$%z$L^4w5Lwcd7Zj?2LTO7Udd$Wxq=67eDJ{s^2sNk zl-k4WDOb(h*ng>z98Wy=x}4YNTG@|&jPKk;4gvY;hQ<-FHo_jn2~4cb`$@g4U%yI5 z1kMt5$(3>)HY`x4PcKfmcvO6Vt!N$0WmNd-C6(vSslb6xZ5-jl=QTgAm#eQkPaeKw zjI?eQZ+v%f6C-x%Uo$u1{oJ{8Wx#*|Dk8@Q+(1srz>eehQ=#GzpnDg>kc0CpV32s? zi6_*C9p(|?plaqs1&iO7qx=p7Hv}K^(T7-Ej3yDBU9U^7{8cvtu zt=h!Mk6U)hI}@f!e4_|SN@%HU;)Zp=7~qy{8PoGmcL28^-gMJVO0T4M2JWnii;Gi! 
zI!mTK^;|Xt2v-K`=!7e`VN$wInKDKBInpj$7)Pr!ye}Da-!Q+u@N4??%WuoAqo0(> zHch2PT$Jp}%~z^{UiQND!0R;9>we7pA|a;k6{Dab21C>I~U-V4Ji} zh?NHQ>p78J$1T}VzSmraE<*bD?W^|DPMtbc?Zu_o=7bU_q&($OGS5@bqTIc3Vvd%p zQKLo)ceL?<5E6wMvE9K+_lH(jXU|jO8E2IQEn2E1b7X1(5%AB}`r?tv;P&qX1O>G%Ha2;1bni67 zlG~FiqH2c8pP1;9ZHMH;7skr4O9n};nl+tjOj$^*JBE!(O-)rB(#MS($4BLzf!-TV z1@{opjD(YR97oX+F9Q*O2oBWXd<@~sc;5Cw%I-b#=EM)>wdX&PhJCw9WKf{&Jz8*V zpnI$;MGz&yV&QnP_{-vez_GatzDQNbzj$$#T7jyX08|sH4q;QcwcD66!P%*~PXX5- z;#yg|700*mjAjxFUQ)rMsWcp>QZ9S=d*KGTWat?(?yj4qV~2JsFEf&>7jAycaen#b z7v+bC8`ZTJHtxIC+{sPy|J*B${M^V-Z`4PR9#xz9@4N3lHUC0@m|^M;KxVE_^Y#yO z=F7!*K-Y(g+pa?sIdC*z3UN=BgZ=DK=8JJ-u>^+%z+1&Ws6uyTE}S-9vjjGSx15N}5oon|;DHBJI_kCDY_9Xr`KcHGp&yO-biT`huIBo5wo9+ndX9$i zyoL4FZ9C+ZcRrAJU;j#C&hIR>z{vLFcv=o}Im~r*9WPdc>!YV__Kzggc5l|)>4%lR z->#jVT7y5j!fyoNXpE&>w=qF0R=lUyI=31;;%Ok?qlz0BS&Lx%GNb__vtpbx;KW%7 zMCYpCHeqF9Cwb+8+ofx#4l29yxDq3o+8*aEk=yQkCfTy_d6cvpKD%1bSk8U&7 z9a_w2G`Rit+m%;4N&*Hm?A(FH%=Nj1!k?TVP*cT8GbZf?<&%aaocN-Hd4%~28URAvFHvy_9Vnlj6s@2kf z{xzu{F=(VPr~S9x>o(G0(7179C)ekh2ITr$HPrC()P6H@Y~8v;o_*t8`RJWl(xP8S z!>xS2!S$Km=tIT(trplf0cc9`4+8>icORKQJyYrW$;mXD_=W46+jK$*z)^t`(qQCc zu=_r42@DFdk)1fC0fFa~20=km1Ai-)ZI<3W+sMlg-Y(rw>!PwVcZj)x=6iFTWy_Yy zU3c9jOP4NH=F*&;9JLf#BZxWOzwJIYHkMX8)hz1ybnZ(Ph`|ga^`gX&!S!kWq!X_~ z`DW%1I1AxraIy$#AJ;(k!L(WA<*dcuh`3iYN_)s#36)0(?QQ9h3%}vvVc;r!Df@+qcQJ!-v7?tg+^t z9YikXDeyZ35uFodoXs3NcC30wlPRSE_TFrXy8uiGFgt{Emrb*|%!bgm!me-Z2C~aLp~1lqrp}v^tGNDR zaDDMV*EhG%b8F49s{YOyJF=w z>CvgBy!zm6a@HBAtD2fS%2UoB_?K8gV@3aWcKHO$J{&mM+M7uGg3`V^ms71I;NU_44>j6Xm-pizU8)M>UI|R#>Daw8-imfp=KuOD|iHDq?x%NObWd zSQ=AkaQ(%c)H)Vhr#eB`>O}ws5Guihce;u^}wkBpT*=bWW-V+V2{YZ;Q=rIUFiDy6}!S+mqKYpyP( zmE>d$`rHE6{i&y(Qe0oVXfz|XySACReojuVeD>wHa`O$~`lmIK_BgZnfYSBB;>q>V z8IJxS*AIe)2lH7)Ycbxv<;YK;{jTzd8C)D@aChaMsCU(o01Q|d(tuV0I~UbqEzw&T zAP>M*Ndu@whFKmdf{!EH*J|D4Q#KLF{EMLeyI7>NyEHv zMgIiDkUt2z<6ICahPp$-&^(JpRGUzJ3BUl6i!^9?(TJO2Kzj*GtZUmK4UA!% z8Xu_Glm?+#UK<$}B7d#jDfJpf$+Rc$mUDaeGQRsb(BoNL)R%ea1=qn7!K&kSMUD7u zqU7z|(38cZAWz64z 
z>tnCv-vPFOo6{FfU8V~3(`>FPtAalz+ET$>syZ2^fn#VCcp_FQ#w9+SUEHmqGsu2LodNzF4M>q&9a?6c1~|Jnaz|NH;HeejTEFL+bNrp*w! zrmMuZZ7l^bDJfJq@ZvB3P?USxwDmpzhB=ygC_i_-I@1WGzIsk;)2J9gJ+rP22Hg|s zG>*+e6yadno`%Oum#OV@R@JH2+ky!WLyGNucaw~na)+efd#7~jaItwFlKfO=fvycW zKeuSnB9#=vi(@`MAbG_VCKQP-(W`qlz0TVItdq1y zvSj^xU&*BjJ>)37I|m?FLckbSr>D3iIVtsN>*J*zQx2}mMeGC0S#IlBM}x-40M?uM zYlC5TK;qAUHi!gz8O>CRD$X6KLbc5anzk^DKHISu-_0qD9~&d_{ck?4fG-X%r%#`* z2A3Rga%Fg?W}fL{cYOYGaBI71XU&?WV)pqQHI@1{)0dPWH00%6nJ{*i$aOvB(y&&L z`sFILM0I>$@2Avv1Urx;;B~t{av08(10UpUQSYp|ug^3M>bw0$%K$#Ya|hmr?ukR& zL;TOctYfmR4RFR6T*vQdYJ(6d2Ylc5C7UGefg~B1a+gF$bz&=icCb27Wm8ofFn)&1 zqO#Bop-Dve6T72Uxx8NfWj2tza7ZW%~WR^snl1`ZfzPj16VKNrwtO4lhDy; zLK}3z*nq;HpRsamfdA;bJ`agE>%J`jcA;C|d^}weAg=^V*~ z48!B`3$id3f4O9%j7m?G2S<;RE-}%nIo&>12dZpzZNR#Dd3lnUn5b$| z8xTE#0d{%^uF^z}|D_9-CJWIRl9H0tq!w3{XQ|C|vi|CiK9SoeJuOAsilo;yF;WVt zUk0gfhAgv&DoW~$OG*7B7y(Z#$zM2IooND6-+D*&({vcXdMiEgCMJG(JfNr$P7xG^ zv4Kw;*aRRG(7Vwd+TgQ2SS51*amjgVj3mVmfR~3UfDDR*xZ|dpqIW>f5oBg&s=+9$ zM@O4cm$hXAy}s_@$5iHMV_C8$=8q92$Bo zBmg|A__;l1!B-k4LEWP~<^XaP00zOz}9Q*V{Ye*r{G_pa*A^-g{=Qs~-%`&A)Y z7+A+NOub{DtR4Lr!;^VmYLa?^BANJ%NWmfL)i+8GB17vKq`nPBVs(BMqXTmXetpkD zu>N@1`Y)dNtJsTpl$mD@!smOfO;Q&YhLJQYXH?Z!-r`7Z|x|dw2*%GI;P{ zHSa)%V7;0?1EEN=o_|UHF)dpnk)GHNOkQvtwmu|)UhwJ_h?fe6ON_4MMR1a(?O&0* zS13rfY;iONTVH=)eczuF`kAJ(uW5s>14p0Zc9dkG2i?y;z@&V9e4nR?uS{(~z%C|y zY+1ZThK#yirjGfu#P+;IQKUB7fETP>xl#xrNDaV2ER)Wem1cK%)u^>$Dk7s&nZwLD zqzMxysB3t-Huv?w%k_T#2NM4PlG2Zykhse_%6?2}olp#9-Hal@gd#!yU5E>A{Suec zIi_eu?knnyDfJt6fry0h=gyU>Q>Uu-t7-$zDl-qkmii367&K^*2@Y0EssG*X zJ(3A1lB~by$i=tzkuc;`?>m0N=i!0TXB+$#SXcy{4ykV#xsD3=nBvuM>{ChUd`bqg zHG|S@7{L0L4&s)sMp!N(T1xf;qQ&d|2}Fx298HLp*62XNFby^@-YN+reks$&r%0c; znt5P&PhE1aS+hn)jT)u$a2d45)nGVzMbtHBr=_H%sHAibo~vG~4piCDEaM+NL=~9n zUb<|B{BA6yzRM|bms}*pkopRW1Yeo&WWJZ7;@Keeo#Jut#VmR{T$?5oNi&i9IvJZA z1E_bu!NMuxKhcTFR~a6R4O~Hen1+c?qGH-@BKgJ!fN1%=q*Uai-7@EebQykolC(w` 
z0y`i}P3>};H`E4`CQY($p?(=+K{x8@K#}a+wOcY~{!`{0-yrpztY0xN&nu4B4MRwBK0kLsGnxZ02Tp!+JOH@-!|-Fui`UIfzRP4dzR_gxcIZM=;W+?U5dt004%m5|{wCKgC z#Nofdq}e>o8}#Lhh%hiPHn1RC(3|WI)gCcL8vxNFA7U1GPO98F><<#&CQNAqr7EBWI9#yfz-5?ceN_C$i?rOSAK&%fMdB#@`Q=2AAdGv7VJbW>Px5 zM~2~@?+J2^J@DRZWy1IVnV?8)TAU5P0R}L?f?ra8)NR8i!yELl!{u}-_(mptz{+87 zBEguHPi=56wZWQi5^e@q^Nbi}r*)!KAy{W0`?y^U>O zZkK5@o|l}t`O-Z;j;WytS=Lrn;f3b7*zE|ip-7BpF|k_-MG}zuI(aR9>368v|KqO> z;*-Z=rSr#-hS|dFMu0ps?9B)YpEhWPr2JNxJNV+iJEV8-D49Lwe!2GQnnSdljN z`WPZi-*|hej2sTBe`SR9zVJLH^??vzO8v?PRC*#4RS?Fo4QP33h0AkRsnY?eUr8lx z&lfYV9c%Wx1qGjZl|OvzRw?~@)7r2eaj!cJZx3fsNH=&-U`_kRnJR9cNN7jkhKz}d zkZ<5~pEvnm5)|seYBw=RSZSr`NlAVxPmxY^Tk5kup-56EO_z*^UXx4WdmyJFM2g|z zF~-#8D!<~Q$bZSERX@iaf zhNi&6$iN^uRA~cd)|lCXsucz13vCdFK)Ut++bJCZ5Su-DwA^$<&64tUsn4YJ1#b|F zWTwc~7fA2(+e#szNZ5;cQx^7m6y>&cQtDS!Yy?N%SF}8LtvYj-LXoIvJ4-eJ2B^u0 zFf*Rm-@H+YBRV4^YA)stZXsZo!x|e{ol&($muUpX2AqK{Uh#!YdHh~Uy=R0(o*$vk z+Y(;CwSH{F#?3MwN$D$JUnjAHVv&^Yky2P+cA|=P9rO;QJ^`qpT~IP$?o5PgXjDn* zIt_k~{Qv{hoa%nspwn$b|AZd+B<2mmJRBRK=bHWAcwUBajE%C|jSQ8C5=%lKRTlce~ev2~!I5 z7k;W5)&}x5f3@aYGcX3I#kcn#8*E;8UiY{hOrFIccD^SE#_+V;b5P1gr%6-_z}V;u z+aZ%&zJh(P?EO?U z@bEzDGdbPxPAgoIJM8GrE#I5A{w_n#z*@SsAW@^MT1nbQ_vk zp;8RfiP!Su*ll2k%!Q3IX;Fze+-vG-Ase3iCO@IOF@&)oYD|Fu1JsyQ)mPL*tuZzj z;_(Zf{p2^2-&TR)YKtp|fT^sbjyPaW_Ut>5t zQLud~9Kk`G_g>lcQ!ooVdqCdL9kRd-F`c%3K^&;WHJ-)UZK sV-?8_Rs*X6+Ie=Zo_LyX*61$#KL_iQw)Bs0oB#j-07*qoM6N<$f`Y6+*#H0l literal 5347 zcmZWtbyO6NvR-oO24RV%BvuJ&=?+<7=`LvyB&A_#M7mSDYw1v6DJkiYl9XjT!%$dLEBTQ8R9|wd3008in6lFF3GV-6mLi?MoP_y~}QUnaDCHI#t z7w^m$@6DI)|C8_jrT?q=f8D?0AM?L)Z}xAo^e^W>t$*Y0KlT5=@bBjT9kxb%-KNdk zeOS1tKO#ChhG7%{ApNBzE2ZVNcxbrin#E1TiAw#BlUhXllzhN$qWez5l;h+t^q#Eav8PhR2|T}y5kkflaK`ba-eoE+Z2q@o6P$)=&` z+(8}+-McnNO>e#$Rr{32ngsZIAX>GH??tqgwUuUz6kjns|LjsB37zUEWd|(&O!)DY zQLrq%Y>)Y8G`yYbYCx&aVHi@-vZ3|ebG!f$sTQqMgi0hWRJ^Wc+Ibv!udh_r%2|U) zPi|E^PK?UE!>_4`f`1k4hqqj_$+d!EB_#IYt;f9)fBOumGNyglU(ofY`yHq4Y?B%- zp&G!MRY<~ajTgIHErMe(Z8JG*;D-PJhd@RX@QatggM7+G(Lz8eZ;73)72Hfx5KDOE 
zkT(m}i2;@X2AT5fW?qVp?@WgN$aT+f_6eo?IsLh;jscNRp|8H}Z9p_UBO^SJXpZew zEK8fz|0Th%(Wr|KZBGTM4yxkA5CFdAj8=QSrT$fKW#tweUFqr0TZ9D~a5lF{)%-tTGMK^2tz(y2v$i%V8XAxIywrZCp=)83p(zIk6@S5AWl|Oa2hF`~~^W zI;KeOSkw1O#TiQ8;U7OPXjZM|KrnN}9arP)m0v$c|L)lF`j_rpG(zW1Qjv$=^|p*f z>)Na{D&>n`jOWMwB^TM}slgTEcjxTlUby89j1)|6ydRfWERn3|7Zd2&e7?!K&5G$x z`5U3uFtn4~SZq|LjFVrz$3iln-+ucY4q$BC{CSm7Xe5c1J<=%Oagztj{ifpaZk_bQ z9Sb-LaQMKp-qJA*bP6DzgE3`}*i1o3GKmo2pn@dj0;He}F=BgINo};6gQF8!n0ULZ zL>kC0nPSFzlcB7p41doao2F7%6IUTi_+!L`MM4o*#Y#0v~WiO8uSeAUNp=vA2KaR&=jNR2iVwG>7t%sG2x_~yXzY)7K& zk3p+O0AFZ1eu^T3s};B%6TpJ6h-Y%B^*zT&SN7C=N;g|#dGIVMSOru3iv^SvO>h4M=t-N1GSLLDqVTcgurco6)3&XpU!FP6Hlrmj}f$ zp95;b)>M~`kxuZF3r~a!rMf4|&1=uMG$;h^g=Kl;H&Np-(pFT9FF@++MMEx3RBsK?AU0fPk-#mdR)Wdkj)`>ZMl#^<80kM87VvsI3r_c@_vX=fdQ`_9-d(xiI z4K;1y1TiPj_RPh*SpDI7U~^QQ?%0&!$Sh#?x_@;ag)P}ZkAik{_WPB4rHyW#%>|Gs zdbhyt=qQPA7`?h2_8T;-E6HI#im9K>au*(j4;kzwMSLgo6u*}-K`$_Gzgu&XE)udQ zmQ72^eZd|vzI)~!20JV-v-T|<4@7ruqrj|o4=JJPlybwMg;M$Ud7>h6g()CT@wXm` zbq=A(t;RJ^{Xxi*Ff~!|3!-l_PS{AyNAU~t{h;(N(PXMEf^R(B+ZVX3 z8y0;0A8hJYp@g+c*`>eTA|3Tgv9U8#BDTO9@a@gVMDxr(fVaEqL1tl?md{v^j8aUv zm&%PX4^|rX|?E4^CkplWWNv*OKM>DxPa z!RJ)U^0-WJMi)Ksc!^ixOtw^egoAZZ2Cg;X7(5xZG7yL_;UJ#yp*ZD-;I^Z9qkP`} zwCTs0*%rIVF1sgLervtnUo&brwz?6?PXRuOCS*JI-WL6GKy7-~yi0giTEMmDs_-UX zo=+nFrW_EfTg>oY72_4Z0*uG>MnXP=c0VpT&*|rvv1iStW;*^={rP1y?Hv+6R6bxFMkxpWkJ>m7Ba{>zc_q zEefC3jsXdyS5??Mz7IET$Kft|EMNJIv7Ny8ZOcKnzf`K5Cd)&`-fTY#W&jnV0l2vt z?Gqhic}l}mCv1yUEy$%DP}4AN;36$=7aNI^*AzV(eYGeJ(Px-j<^gSDp5dBAv2#?; zcMXv#aj>%;MiG^q^$0MSg-(uTl!xm49dH!{X0){Ew7ThWV~Gtj7h%ZD zVN-R-^7Cf0VH!8O)uUHPL2mO2tmE*cecwQv_5CzWeh)ykX8r5Hi`ehYo)d{Jnh&3p z9ndXT$OW51#H5cFKa76c<%nNkP~FU93b5h-|Cb}ScHs@4Q#|}byWg;KDMJ#|l zE=MKD*F@HDBcX@~QJH%56eh~jfPO-uKm}~t7VkHxHT;)4sd+?Wc4* z>CyR*{w@4(gnYRdFq=^(#-ytb^5ESD?x<0Skhb%Pt?npNW1m+Nv`tr9+qN<3H1f<% zZvNEqyK5FgPsQ`QIu9P0x_}wJR~^CotL|n zk?dn;tLRw9jJTur4uWoX6iMm914f0AJfB@C74a;_qRrAP4E7l890P&{v<}>_&GLrW 
z)klculcg`?zJO~4;BBAa=POU%aN|pmZJn2{hA!d!*lwO%YSIzv8bTJ}=nhC^n}g(ld^rn#kq9Z3)z`k9lvV>y#!F4e{5c$tnr9M{V)0m(Z< z#88vX6-AW7T2UUwW`g<;8I$Jb!R%z@rCcGT)-2k7&x9kZZT66}Ztid~6t0jKb&9mm zpa}LCb`bz`{MzpZR#E*QuBiZXI#<`5qxx=&LMr-UUf~@dRk}YI2hbMsAMWOmDzYtm zjof16D=mc`^B$+_bCG$$@R0t;e?~UkF?7<(vkb70*EQB1rfUWXh$j)R2)+dNAH5%R zEBs^?N;UMdy}V};59Gu#0$q53$}|+q7CIGg_w_WlvE}AdqoS<7DY1LWS9?TrfmcvT zaypmplwn=P4;a8-%l^e?f`OpGb}%(_mFsL&GywhyN(-VROj`4~V~9bGv%UhcA|YW% zs{;nh@aDX11y^HOFXB$a7#Sr3cEtNd4eLm@Y#fc&j)TGvbbMwze zXtekX_wJqxe4NhuW$r}cNy|L{V=t#$%SuWEW)YZTH|!iT79k#?632OFse{+BT_gau zJwQcbH{b}dzKO?^dV&3nTILYlGw{27UJ72ZN){BILd_HV_s$WfI2DC<9LIHFmtyw? zQ;?MuK7g%Ym+4e^W#5}WDLpko%jPOC=aN)3!=8)s#Rnercak&b3ESRX3z{xfKBF8L z5%CGkFmGO@x?_mPGlpEej!3!AMddChabyf~nJNZxx!D&{@xEb!TDyvqSj%Y5@A{}9 zRzoBn0?x}=krh{ok3Nn%e)#~uh;6jpezhA)ySb^b#E>73e*frBFu6IZ^D7Ii&rsiU z%jzygxT-n*joJpY4o&8UXr2s%j^Q{?e-voloX`4DQyEK+DmrZh8A$)iWL#NO9+Y@!sO2f@rI!@jN@>HOA< z?q2l{^%mY*PNx2FoX+A7X3N}(RV$B`g&N=e0uvAvEN1W^{*W?zT1i#fxuw10%~))J zjx#gxoVlXREWZf4hRkgdHx5V_S*;p-y%JtGgQ4}lnA~MBz-AFdxUxU1RIT$`sal|X zPB6sEVRjGbXIP0U+?rT|y5+ev&OMX*5C$n2SBPZr`jqzrmpVrNciR0e*Wm?fK6DY& zl(XQZ60yWXV-|Ps!A{EF;=_z(YAF=T(-MkJXUoX zI{UMQDAV2}Ya?EisdEW;@pE6dt;j0fg5oT2dxCi{wqWJ<)|SR6fxX~5CzblPGr8cb zUBVJ2CQd~3L?7yfTpLNbt)He1D>*KXI^GK%<`bq^cUq$Q@uJifG>p3LU(!H=C)aEL zenk7pVg}0{dKU}&l)Y2Y2eFMdS(JS0}oZUuVaf2+K*YFNGHB`^YGcIpnBlMhO7d4@vV zv(@N}(k#REdul8~fP+^F@ky*wt@~&|(&&meNO>rKDEnB{ykAZ}k>e@lad7to>Ao$B zz<1(L=#J*u4_LB=8w+*{KFK^u00NAmeNN7pr+Pf+N*Zl^dO{LM-hMHyP6N!~`24jd zXYP|Ze;dRXKdF2iJG$U{k=S86l@pytLx}$JFFs8e)*Vi?aVBtGJ3JZUj!~c{(rw5>vuRF$`^p!P8w1B=O!skwkO5yd4_XuG^QVF z`-r5K7(IPSiKQ2|U9+`@Js!g6sfJwAHVd|s?|mnC*q zp|B|z)(8+mxXyxQ{8Pg3F4|tdpgZZSoU4P&9I8)nHo1@)9_9u&NcT^FI)6|hsAZFk zZ+arl&@*>RXBf-OZxhZerOr&dN5LW9@gV=oGFbK*J+m#R-|e6(Loz(;g@T^*oO)0R zN`N=X46b{7yk5FZGr#5&n1!-@j@g02g|X>MOpF3#IjZ_4wg{dX+G9eqS+Es9@6nC7 zD9$NuVJI}6ZlwtUm5cCAiYv0(Yi{%eH+}t)!E^>^KxB5^L~a`4%1~5q6h>d;paC9c zTj0wTCKrhWf+F#5>EgX`sl%POl?oyCq0(w0xoL?L%)|Q7d|Hl92rUYAU#lc**I&^6p=4lNQPa0 
znQ|A~i0ip@`B=FW-Q;zh?-wF;Wl5!+q3GXDu-x&}$gUO)NoO7^$BeEIrd~1Dh{Tr` z8s<(Bn@gZ(mkIGnmYh_ehXnq78QL$pNDi)|QcT*|GtS%nz1uKE+E{7jdEBp%h0}%r zD2|KmYGiPa4;md-t_m5YDz#c*oV_FqXd85d@eub?9N61QuYcb3CnVWpM(D-^|CmkL z(F}L&N7qhL2PCq)fRh}XO@U`Yn<?TNGR4L(mF7#4u29{i~@k;pLsgl({YW5`Mo+p=zZn3L*4{JU;++dG9 X@eDJUQo;Ye2mwlRs4zN`O$LcPY}P6Y0H$-g{9&n$kh(y>|o@5RfWe zflx)7)Hj@Sp68tRT<`biORijdGBaz+-fL#v>t3^?)K%r7gmi=e008<-0jUW9fUrpr z00PEdj=Uz%u@_uxDHSOIpeljr(hLv#A8w(bsR97_umS++Pypbz3VIU&aOVdAFs1;2 z=o_2g4 z8#gy6Q9eFTPfuRYN4$kKk1qbN+7kNUUs6IHkMvUJcx+M+C9x!%r3 z@OC!;&yxR^lyP)$bkTA$v#^wS_%D(F$olWv|L8~m-~9*){a^k3N6Eh=#rSTY_kTS9 ze=65Ma zsTS@#2JsF}hR(^pH~2S{`x~AQJe!6ecxRpMx$rctQ?SKPWC=&byfjRA(2q0cgO3MY z`RCwC1>Hx0OmRNXCcr^0HIHhRvfdVrXhp6wktaeFaPNaCfKg~9f*Jn*^KWM<5>Y;6 zJglmOZR;%@zIkZr|2yr%s?A(r`_E_ED7qsJDN#Z)4!P*f%RHwUWLIRC-<7I0`ufDN z#|H5wpIl!T+qa$(L;-B--o`g;7=>tt94zSC%nSByxf${()`}^Ht4a;AQfLVAHY0uz zz3C;AV&UXyJG*|Xn0G2ob)`P#y$?M;{=50ZlchM~F7PvPeoQMj8@b+t+cwi$*=JRv z5MT&|EWlrF=_R4iMDD(&^Q092uQds};0n@?r%mzU{v_Xf zjpI>dbB=6Cm*k1o4WR8<_Z|6QF08s3d$SUUu}#BYFU9XjNhEPwd7<94`fx5FJUN!-as zng-xDyHgwaSogkX_+GZEufA`TFPDdT?rsYBmAz)0=ww6DExlR!1kn*3zYJTm$TNw* zI*B7cim3x0-YSeP2dG(R&T$rn)f%+W!Bf%Lv`i)V@peaf;vr$^A&>g}Qvk`YlkUhs zkRuuuQj9lGP*?ii;o;phkfG7`)o%Vl*QSXAK*9r>q-92WMnU&^XIeEy9FuPx~crH?&8@0=5oTjdCWj~ z-TmiRq1pTCd_hxiz+n**K24rThkg+#QyoYVv;~3KiP>^4Ni;tz(lW+-7=UMZoN2Z! 
z#Bn?jDY?X4TS3Ri-xeNrY3>W6rFy?-&`9#?34u}G$Gn^C(6@5M9VP-mqAuMl4ZHf3-@fOpE9bLudZj$IcKvhUm6F=mTh~my!pBxeW#lwt5 zLc;2R9YNd6lJN=h0S~W-RHsE*nqiZYVCF8>IIPN&=A;|^$ zw3FQLd1#0vW>;fxEKNw7J>9a7lrC`zyT=9fNg)IPF(DtyWtl70aclY8y$h3D{rVd3 z;ofkJ4avY?)Avt9co1`%f#&kMXyEcMatGS4QAnm4t|O$*Z;QN5z+}aicIi0z;iEk}JoWD95AhZRT-;*1t3oM6{G2_bR zGTsSzUwm}aFMIZ*3UTvhMBvR;OPZ%5E=@Fg62MI*<=ugVKgFfO_$+Fp9zWp%`-D^d z<}96!NNfqq%}|(l+K?95tkNG0{KET18BhA+@Fs({{eCi{+~*a5F>RU-{|%b;a?VEm zvqQYWMH{M)p*bL{rH9e`)|3oKb|#NpSnhNhk&)G+GTX~1jdCHD6{=xFCIt~> z{SCCfSBd9OrkYNa>5}>z!ZiC|8UU<_-BM~KK`Cu7QQUGD25nJkA=GHBp3ubyJU(}2 z$8Uc`PSCya>FmRdx$QZ*(o)U$_k6 ztrSF~=yt}ykKyPsFb)Te>*T`5#I(1Kf*lS6oW0{2ZaPg6{)CI|d{Uhf?sp-Ok&(~q znr8JokiBu)K3TLg!0=Kw@$xvgnPX#xiu?;jn{Bl?=S0_0{1t&`y-;aUf(MhcX=uPf zipf~cryo=nri9Foe5w;qtbb&BF)zleFVhv11mXORDuhCOZk9P7l1{)Jk=1@Ly)$q3 za8{5eP9m#%^&2Ey33p~}^aaZ8Nzd7?NFHZJF#A53ha0f6)^?3Qf8_lB&6s+!7Qyq0 zzjS3mJOt1c$kfxUN&j*s|N0C1-Gy%)26PytVMu1V4-WJM8u^=m(VQgDcn1Iew{15A zMeax2i~4M=N&~e(K^0goXC~C=!z?}I^lxY(G;t-AQYsC*9mK?Tc2B>S-AgQ2Wv;X$ zGc8%tGqT}Fo@m`af6TpdDmBO(IwUD~W-Vbu$ozKoXr#sZ=GSaoCLW~2$U2vYC(do< zh{=8FByOF1kkUuoGsF|n{a65_v2=Zvvt<%+=K8xx%`uUU{(Uh?#`$N_+TgpFLS>^h zs`pEX>w&^gLs^%Y^KLexDi5Ek|NcswMdO>Sb}0`>VN4N837>?woWuMK>){`I=^|hL zW9k{89=DZ-p$@NLonSGTA1r=~-`g81S%^nQL@d*(mKtB2=S^5V24+8Qw5Tyg(tDV= zIlZ^qp#56{!O)g zBb}SS_>x5?flfNOc~CN+dwoa0w@z7YKj*Wh93dn5>+!CH}#?N?{y1`JU&9yN2?L`@9ou5Q-6th;wuJEjijX2Po@Po}%U*AAcc_$NK2_ z1UepR4SGtusY2&|AH_U@zak7NwZo%Sj}d&j(wu1hC$cN*@Z@2$s}A<51#_z@;tI&X z`>Ip)A5_;OP4BVf{yl#FvqdQ^N$Z4Dt!vN`20Eug1WKs4uw$=HQbxc0fPH+&@rtDuX0a|n zEs&5HPlIui&h_523RdyAt?8fN@2~(ja4m7aCwB}qIm*kdb`W@0#6clXis}huyQ^#MCn{pKXI`L``PvHtCb#~8rHAk>bsbx7<6-~x9mcV6} zX28X@`By%^Wb~b+T-KLYE0r>s(d08%t_&1<7O7)dPziO?3YwB`6Av+WZ~*y{5Wf7X zrM&Ow)6qzMl3Ka!2yLYlEEN^YSmfu|7<()yekTv=#nqPak^_snOzE&@@(noRl8Ib_ z2q^XAM|i2>6>nQyU}EbARqQ3F)LC4ync_6V2V*4}t|xT#v56Blg%&LhC9gZ& z(&Hs+Q-&0yL%VWE6)`z~kIF?%TY^BeB=v&dH7r!{@Y(_}yZ6Yt$Q*DwO>L3_oY&r6 zBEpHk#;&IDSI_4gL`t|r%p9r=kEHCCOgc`NuK{frd$ERA5>Z%KlL#3>VlC|>@}!Oj 
zpb~P^?g}oB-AbBJcS9;Ozq{HeBAP6jxzJIZ2g)3Gtd+7fe-y>YeU5Frrct~H9|tQ?qiW4(c0E`na1QO483y)==z~b1V%95tS#lE@+P0LLZ9e?pLM!R1y^`MUvssq*(lY$Ys%=s6`#(Ol zpC7Tf*(|BLq*3u^Oy-nYk3cFf;PyCk8JOaN_GrH~Af{zLK^+IeR&I-9?)m7qs&uE8 zj`;Q^{b_%hRRtlNJr31<+Uk<@$j_7v2eYDAlYwda@!YfV`bJ4?_d)hFz|Vm4+2>r0 z4oPHV`PbcZJSWNM89M4Au3YZK^AE6tI~wk6AO05DaI1#>g4=NH`8j25WAAXpq+{q% z2aes@MvicSv2ftK-1IwwWtQUw*Z3dRZu*r0fg`Lb#ZGC_U%!Jnc`G#o)p8Dq<3gap ztZ=62W$7GWpY4eN=PA);N2!F;@jWZq@jDqj=7Z0t1*JzN?}jHW*TU9GZxv7UR`KE^ zE+704JTu6tA_V$pqm?h7H+tdvURfsc7406gEvBAjm1r9v`DLZX48Eix?hq#tfxX#C zcXUmf&uGP4JzLm0TIye$^JG<0C8}?jI05auD z@wdaDW05aK(huVZLIlxs%XwoYOHxxuw)^O%@s$ukL^x$Osg1Y)4`%pXwEG3-XUJ#} zAX4%5qW8;)o`RY^X@vzoi#wOLsgCkP#xVBz>f#Ii!|5`OY8Ep!LmjLmbQ$Zjl4LQt z^NCtlZ=LVN_=eu=ejAlFt|<)h^OO1b>>|>e1|P~4o$F~HJEL#ndM9=%i&Zw^&ge4V z^oSh9%C7|384?$*u~9ugWv1laQ+awK=O#M`eu*yPZdN5rB<>XT9zT3IA%|ef-}z|6 z>uDIPC^R_q+|0>?Rx^+wvIAK6QucDq5cuZ=L=`(M9iv)k^CntFSJdA$!RPOmd|#0G zJd!*=0O$Q7PXku#mJB4}KHGWFL01F}%@g%CUG&K&)XdNn5h`2+@B>c-Tx&zW1ZLj_ z_z6!|8vxGc4r+`Z(jC`Z(Y?#fAb4x%6+mVgbsPrx6(x4-{KL^rV+F%YCM!-ewxo~) zS==KLK(J$mq#^Yy+hUCzVduz`y*b5#zOZf^(`%MfQV-Id zT}JJt>*M_`U&*t(Mom*Jm!_HBcO7)@cP3ro5Rs$(5hkg-+a2I3|ClEc>abS_e?KTNb0H!kAujY76 zkAvjrEKl1NaSud@N^bg!zD_LO%S4(%z(O&cjg9 zpa+bWJj`PsT^X2K`ihrI?hH++pEAB=ku1K5<7Vl%{cJAgJ=E+3+)aj(9A{>1PYtLt zm{zHd9Zx;_T0P3k;bGz zLRsA3H~tPpttH4!!#skY2{Ij|st6eT58((wVQXXhts7f+>G8AXOI+vYl)`Aa5A)?h zVkMNTI%mg{#C3rL`2qg#&iNx+$$0fyBwUX{p(Y9Z?Vp1EL_NK7&RtIR=W5)0@+Uja zi?BP-Ew25p9aSH0u+E{G2c@Nw2t>bk*sOekZ;yj3{h0DePf@kjn?Y0i=o8;B_tp-m z5|uk$+VE#pRhcSdAMqDu8+urYv!GMk-0|E`ocJpZ$2ruPMt6bHa-5G#a<;CB36k>^ zT{uy=DL={{(&Vvoh>D}^oy$x_$7Ox>F?^v(OPsIIwSbi7sLrv<^O&XUsP^NYhRqO| zkSwh+iH5ib2B<0UE;&m{=VtP6XeLFws*x26V6YQcy1D@=Jmb*`U3yy!O^ofNQal{? 
zL6{!L&~C5re71hUjq6phq@9=gxN~eTm|0ul3Ii;@oE=Nx*)b>nty$Gq{JPAFMUpzT zNUQ?QOYg7X|F>IzuC7Td_)vmvNy{k*|Aa6e-o(k0#tGyFL_|`k*np*>7 zFI@)e_+}q*pEQvXgYd&6rcB!ZFtlCgNPdtEbdfo8uN+o_u#KSltA-n|9xuV{aZK-w zC5(|sQfeGO1^S<5Rf*9R2<4*?{cnb0y7xf#&lmLEE2KTVqR7bN63_p@k26)mV-!;U zltzBLE~ewjW<6#T%ziTUto-HS(_#tA02&dU(BB`2T)%};;!ob;dsmnRdy13_$X<5v zZCahqStm$er>Zv|w5p3f#}JMKQD}1ZA8j!J=}IUfJIwZTSoJQ#pyJ^#4$}w~a~9)n zIh?In8op!SkGl+aW%&))fBKi<-C}bdN5zvzTDZ8N^G!~sr9gs7h#Y1#bijof&Hy3-&$l3!P5r zl{U@$7JH9h$HF7u047x26aI^is_m0jMYkZ_U~K>6yr`h2~%NlR;k3)6tlOQlAm=rd~^jtK_btZ_&Ob++|IgIlRd z(-|a7;*d;7r!n-i&7}F6wbrRY8(X$kCt4vQEbOnjg(1t-^R)5JM0{^Na94|LVF8`$ z9sI&gTUVbdMGIv#on^=v1uhRpovQ9Tue5}Q&Lb+f=XjgHRCwRsi|O(ll3`Aixn&R| zjKg<^g2~iNn_E=8*KKOA>o5^I_L$dIrk@^7f=FdXfD z213*l?fWu;DW^H7pXQ zr;Zbur{P;_I2La1V7B5@oag*srN0T<4tX{Tl+&BmP`GDg!xBf;(l_63=gNgKth57YnB)cMG-| zB;(fn094h}u>4beGIGxawZE0EqxdYQp_z+{r;_n)K!dbku?QR4QZr8#ytsrjoBTio za*|(wxL*^h*2YqJmLkLp`i;LkZMQ<_tz3mdkQ@6x^$G?X4$|h2A0Ud?nIw-}TWG)J z`CuumbkbXGV*FlRfaMFF@3p*i2Oni7T3_R(6$HHe+GiZdkRQM>s#ObLTqvjfP2_r& zwZSTml%_lx%^V)y{PuSZ^O`|i3iSor{i^0?YLW?A@>SMDKK~m7(yQYL&&}P7503$( zxa*$_AQG)b3}6yUuXHuJ)(k-eLQCB|&KAc@P2vs--W;H27D@fHVC>W5#EDn$7kwj{ zH-C2vqH`?jtF4`VzSbSEeT6vLcSk>}8w0y_TjX`x9g_My{qezVH^c4DNLUy1MWP@U zsw==a^_}|2H#Y?ZUpewjY^6PtE()O5ys=fd(n&m@t;pf%TrK~5nai;*@(K~*zbW0j zE!pZCbAH%2uz4WUQe4{zd3&LU7>NrxJs-Ga^4ZvfUzrqNm~<@4=g4%QN?}MxkpvxY z$ssFmFZ`r@JUeAh^O)ElV+L&nl9}*pfzH~;&M1Ty&XY-P4O73^^NR>1Kd~ARg}W?e zC1G;;i~HI|rwUWZnI+a|s9ahIMVIDx(LD(*-wT(jm>@wR65Q^Nfb6}84_UB!_H*Dx z{TmTb^!B$0_EEC4^crz|W?lXyxIT}9>()P%=4%GFMp3zm1&FkD=pEMSFs9vQHnXcJ z&9!C}FOYD*#;p@-!n>%f3C-Y=2YkEaQTVfT^$#jCaQdm@hs1LkugIqI%{-MZcPR<7 ziEZ_o4pKQd2pT+WexHl@u6xreN9PiA)2ea>CEO(9>=UTKKra;13D+Wm*N#2p_YNc$Vjy@%hGY(U(^X;sk;}!A z%;1*N>NQvr+$`fcW_pFCk+{G4CQfln+|u-Z%9e34A?Q!}E%Ulo1db78=Y;9AO2r;5 z`PZqwSD!O}S{kyfe&+q^s%m)1LSB4{SNC0aA|@eAJ4eDZc82zgRo(i6@y(YNg7eo2 zd*e$|m>ZQP+BgJdCnmt^rGR*?Zzsg7i@=X?G^orJ_kD52j2*M@S6iG|ZJ_7ym)*Ga z5l?l;r9|3$zY&?h;5h5S)!!+dX`yG5?0qVobQ*PNf?~P*#ZYwuYD4Xd$_npc$PX`* 
zIj~r#V!$vH*3>4-@X2^sQUK46r>9Y$9?KnVB-LEm>**!PU@8#Wl9Y-sTeC--+=@W= zTwpEzB40d2+Z>C|b&!9y=W0|^sj4SO8N!>Kdkd~TSqrWMTgK&um@x4ZzTz!RNZlN2ek29pa%lK; z@bY61we!w>*(-rp0pJ*0HX0rLYpb@;*)O+LXZd)CLrq8}EL>&U&W9+AHoIqgCSRDa zkmd1O=LZyuOJT#12+R#b{&j^6=0>I?0-RfAN_69pf32PCWK{m_0Qj1(-|q-`xE3m- z!CCj_@jM`&d9d;05pWAu50vBN;-SbCr7J1t&jDXV*m`*>+svW49cT+D&T4ZsyF92{ zKInGwS+a2WywRY7&fCw2w+TJXg(f=V==zioo>*WJ1tLE51>zrpo@P1gf>~3E7<7e6 zP;VeHMW9H%k#~wXYezxbwbiLYMn5g`Q}0UdBYclClXj9W~f8?5DO; z!FikAg`5vJYa0H(EM0k-_}O*%c2TFpoM(-hLBY+Sb(H z2T)u??|=TgIMs6Wg>?TLV{ZC>1Z!u+h8~IiBa{*#{|nJ`h40`OyMGcHxBOz3IhFw| zoywhxf1X`_I_rsg{laN2>8f~+WKO5mDR%XqJZ@npILiv@WViYM1H&5O8HC+>H|CH zV8rAo49FG8A7)w-ICCf0WMQYMqQf&Lz{?&j5iuzl9>aaiCq%s>c$J$Ql2}?1Owbhe z5mKhA&O{4T-6$hpNsEZ5I(k6*DkJWAO5w?i3OdBu#ologfhj>pKe3Y`S$9Sez>XbF z_@275Hbvu!Fb0J~`Q0PbG*tx8#k9D5m=}8#z^MI%+TR9C`Pt2CsK2b-pV~fpEoI3E ze?!U_D_B0*I-T=wk5%&Ne$V}uID%9XOaRf*oM8stB$iftmcX#Tx-1ZUnlxJ>IqUyN zuznB^>-Jt9l2_aub9rtuRxExv4QI8X^*{aa@n_<>mUwa@w6hQJCdF`u__P9v1YqP1qT4~c`4#6W~QH(Tq0M9kN7A@9K- zxz5Z$tUsmML@TK^tm?MK`&H({uPpJ*8i{#Z^L_^D)K|;hQcQH-No8vfjkPe%80Ei$ zS|P?ozJx@}Wr-)G@C*~%5g$>J-#G@wsqW6*N~>l>A>sFfa5+QYnxGBg{!X% zbK9gfll#yywW@&|p?!ZT$#!w1XXVXa*ILAZ{(WH@FzcQbqtI6le(>te__DAOBSVBB zB52rD8J5UZ4;-27^4Ds#FC$Z(iQ9=lo;se!ph-)@$Cub`j+6r)ggd|U7;Vh>MWnz^ zQx++sTb7D1mwOjlzYSvgppqlGj4Afw z^Zt6SFtZQB0G5x>r7m+qlT28WKO!({3_XG5^V6lPU0==ozQD_$8Z2Ol)UrRw$1~=i zV@hX?B_T*&dgqcGI_>SHcAtoLf|8pnGd?Jj+>N5rRmnk=J&s@JDTy?RNswSC{#whs zZ(#IlwIF)p(Vd>Gnwo+jyP;?sx8fWyzMN?zX(8bpps@Xs!qvJ3ADz)JjCbYTaRZT5 zW0>|6O_JzaCviIZSuY2F)dj$13K>1Fv7fhd`)8F-h&Xr~*tXFE1COjdW& zHPk^n%wu>qs^nhR=M`jMK|-Z?hfrN7JL@Z>x1RrAH%qPkFjyn9E#Sv`wmWa@M>5DJ z{PHX?nmvH~_~zg{{72a2J|3iuDTA1ecC``MGX{3bSQ!oP{|ws7h{KH|YjTx`!6Rja zr>C@|PtxV8ODph6cehKJXjbD1KaYL z30N$7>_{lnz3A-~=I7}-@(B#S5YC<8lqmjl`VSS^-zw$EVEQiwTZL-%NVCZvyv; z$Nh^({SmghzuLDd*N^zDK*B|xHvH1hO<_(r^}sQ2Epi8(k*0iT=W=zoU@nPro)rv@ zLjQHZ5dcbj>H6wRI^ry>-VS_}Z-n1IRSKg$;CN#3*ewe`(25eCqB>N)O>r9{7@lqq zC53iepT`B63y?5`7N1gczf7>0bzuqb!24*$bUbig`q9t!Q@&qBFw}@hyQyW4Zt8XO 
z=(kxc&@I!$2-zo_(1y+|!k%e_eWf7V|<`%yp_l79wU<`#_c^X|S>3>D4MFjjJ76?1M zSrBHZPVi-U?bA5pf&zbGJsrAKEJfr^beOGuOCwMV9QqahBR;E?v}!|HdY#Of`W(Ky z-(SRUSaKtU&-!}&v98w1_H>utr}oN72Fa7k^O3b8+kryWm6sg>JI5PCHUct|9!zLdd)z|lP~%|Mw9;-JQa?H&-2H!@I3S-^ovXY@(qWrON$gh0W#(_Z)Bmh zK%Tf))xlBpo7;C7Dom8p%C|m!qm3Z)3#1o9NImlKvo&Rr@<&z~P$_k&@f?p0s)TDY zPy}JU-ltrv!HlxHmF5r|&6{ire|1lY=FRgeFY)NMYoD_dS24Hr^E&rO^VfaS4Z*=9 zES%EXED>K7=ARmD{bf@v+lUuK)}{TNGnU*x{nbtW`h2NHQhMoZnZ*Ahh2b2c*>UI2 zn*%;^l)ea+UUYmIrS2B*0<3~K*8Wwz>=>A;Qv#8LGY6217$CIAnGwuwn^h@zAiI9| zFmilFgOKUE+i%S&iR>5o@FjP!d6cFew;Q2OiQt{N)^ko9oCQr-rGjw9wjCBdelocU zW;8Cr#(q}?^hFaD}`7Q7IFb5-8CNMU#G{l7zKlh zs?r0YIW_z`6?ng`O9H<)3uU@lhf`3-_J$Bh3cc?h9JP8p-i6AXdTzGvS#3``rJZU! ze(CZIa|)I@Dr)IDBHdrYG7fEOj6vkfZP#_hduo1aCGBi>ID1mC111+jVy{buC*^Ex zv_UVvf!~k`=9#xTsnoY0%dI z^?$&T%-oSPQlG4ahGP`3z6rvi^p;8b*u`K2*2pRMo>$&QYQ~Uj}3# z*ZovKKTU%(J}+|R&D46DYTkP}{n0MRTUrrYTk`SW`s-P{^p_L8dGd~i0xU~z7T>(S zh9PepC0kV-e=rtE>>RUDTxIB<7R87@0=;|QiG%87bMn2Ua>hCGs*}~l(RWpVb$*Ib zz~?FDeDVH6N&2CR6{bTHX|c98HnL{a?P~h#VTg5>xA)?b)~V}M72{c9L0^4P)5oCI zGMS1yBq?7-*Y|B`KiE&F3*23Ay-LH=WxTa`{`(F;eTYjsumGmb~q zf&hKGFdA7WL6$@)gI|K~1l1jW9ikj77OC%GDl>*x>I%~@V={v4G2ej|SW^vr&@qyL zMex4{Y}#IvP5$`EwY6E9w)GCitnnj8Z^JmU6k}EV^Uts3b8ApTVnGtYQ(~fL(9%h5 zW2x6)={5MIF-v1}?1%IoLj^Wi+q?ucJ0op8zUskm*qUy$m?SC;yE@ljO_O)QrGs2p)vP>zMYvIkBWjVH__cIUnx>EA@_xIur3UC+l!G=rac5MNo#NP6c-0PrPO zc_^?>NSI*mof8T?xqBi-VtBd?%2v2~Fm(tD@$?PA(_JKt)w~`E?DZg<313~8CK4(9 zHL@^;fOQI{LaJpITi#NWfh~e2_hX3R7;*X4S=Eg5lxSXtd&=FLb^)-T~pvJ{FlVD`tU3Mgu z6s+EdsyHv*XYRcxmdLf{Muu-G%ke*3R)wreBi`52f&8>Okt*DtKeQG ztLd-i3)ItKVRG~4wvu>*g|zjM6Z7RofEb=M)l`ij++G5{%L|Iavg+)(YvUw=L#$<$ zCBGCq8_Ig*pI00@sJQkQ{wU{ejK-q!PR;=Dm<4wgjOTMVVM3}NkF4mkt7sR3-VRDk z0RH#5S(Ki z=6lpPmfT;GYr~Q0ek2h8kmw_rHeHz%6>TXvI+k486KJ)mP}&J`pz>L%jg2vjDYKev z@FX%GQjq0?^6yd2QTiP}2mU@_t9SxGzdJx5baCM>C|b>-MrvKrQS+S;l*cZFLjL!rX>FuyT;twkNfIHiP@#P*j(Qa)f9WJi}ruXyA z6g~vJFszXB|J8$fSW(jghY?f@E}!QM!a<}&U-@k;)Pm@%q5N-SwT%t9MGbzWT6>P! 
zIOZJ0Cot^=0VvV>;Tv!4we4$jh(>hS@fuV7WG-JlqAN3<2jtW7{+2`aoE+ZkS)7l< z+C;p7?Vq*SClIcGwd;!#B<${#{=%3FN3rg>9aVC|yetv*p( zMyhIk7>A!rt{Is0MNa;7U)?0!GaZ^@iDeoD)eO4g+-^-zB6g#wgdzEhtybwNL0QKEfU8m;n0tNQf z0XI#QVY%wrJxm4*PtLT7=OlECT<0?w{N)*3J@h`-WuEy7 zy+2>N=RLg4@vKcI>@@>3sCk?prk~?8{p|3r$+?z9 z{<#3=-!SY*;*ayLwm)0#=shbk6mOZU5;Vn9Q>5gIXRO4JQ1I(k9@FjsKUEu?1AYfs zBL1n4Co0ayV>*$S9!b>&T#@Z~yJl1f*Lc%y@cxru=l4)SLo>ZH*GVoM%4WLnV1y0540{{Si8LrRCY}5#Jl@`< zAKaCPOW)RYb8uDycY3AlO3TP;Xs>@uXX&zid&Scdd>ISe;WFvFzLqp*ANoSz>p(M}pdv(j6bdI7sy9TEuOZYhN=Hx+ zWe<63FK^G%6di3-9vBG@m&a{+YDX9I_MXvZe&!FMS3oNvg@p&j#3n+rVa3%n8S*Mm z;$*HfVVy}fydgFFsn*}AZsC|##iPGHw{g)wphD3H`M$(*0Ve%BlFD|W1p$lOIZX@q zD{!r?Kg!go=C`K(IUxfTYS|hCvqT%!{I{YW)q2j$Y3#g9J=a=lE`Q#=xe3?4lnl4L zF=5GAy6Nn=$qp2-+^;@C3Z_FrBM>xxt`su<5@H)ZYdQg19pw#Sm3d`@bJnB>>+CA0 z!;~VGEO@$z!@*>T!HrWEG#Nwi6VMC(5Rv}AM-sK{!H_MUVsd(I{8(J*iB;W_{T^*m zz*{?E3$f?3Q0!DJ`M5vqNnW9!-O8) zq8%fQyVu`rx7`M+3GlLi@Y|L!^^xZXs%&Hmk#EsruD286v6%yWAn*D}*vINcsGTNi zYcnPjg7blyP(`1+X{@@H&k05bevCqY(u#|wi&624-OUUKWg5z)07KIYSgMnL%3`^c z;h0mGI#xZ5XDH7r@;I?P_U6}8XQV4Q9oHlI*WORX*esEU43QR=s%+xUXXx`)tm8YK zuG%#_=~W^5oCLd0un0M7=1NObQuCzRp0OqCvxPfzXDnbI9E*TCrJ->2V1UmVYv4EJ z1Cnon#a*!9Ff0N~nPLx@GW6(tp;2}mAacg{9Y-J;fvSA6Bk1oy5MI1&(>_wxLpd9qnX9^* zj|_8YX1aR}AM)}vde??t!1&orvN5(6!f5ibP~3XxOfF|LXPF8qmVGFJDum5*O7Pu9 z>s%gMxfwmq#=Zpp`#IrH~e zNK5#{e4;&27&3acteT>qod#v@lB~xUTPkONu#i}2pZMFo;-~>$!>GQp@?Z~y;_>a_4*4h_7st;-2oU?WY5YyIbtAT`v zVn*I!)Ws2BRd2_Y-_ z_L<6?fEtFtPxdv0MG|?lV6?Z36l^T$1z4xhZ=$iJ$zNFg>Sg4s_6J2uc00O#)K~x$ zJ)K&cA~ulL{AojlCEmE9i+1-r(QkoCPg1SNm|))=f=P!tQT|=|Z=4$vBKGxc@6?xj zv~{$V6Bq-rRO$Fyt-nm>ve#y7$6KNpPDytV~7xA-5+uW#@(1Ob+bWU+LWmfs4s!af(CkoatSbXTtOX_^UAcIq4% z3INJBEg!h)jMWdHpWpk^=VMf?G|urltkkXNV+EcSHc(ZEU?k|fbM4;$K37ADCjBU} z@MCn+iNl^YpA1IQ)uy7j1p7)wF^_2z;!+=T+D*0-vS@bBT76PAdBeng8BwK^clVr! 
zoMh3XYANr1$%C(r6|Nq@ujR_YVj3w7L~rw=VnBgHXs^;P_Nla% zNu`bWu}2$fW5439lJ+Z8>ZeB+x(FFBTU&X$MAicjw*1e}FB-OXv(1|CKqyzU?Ql)Q zTG%s9%y9Mo^cYG~VnG;MaG6tDf+*CRH>|4kZm5v8CNyMhK8d!huvFOr3CIIwzTG=L znh3Ri;lwDy{e+Py+uikfry(f!ZYd7Dsp`&nNdo6z))eb8lXkOI;{X9g$F_oFnd=78 z+n5Q)OKm|ZjMwD*bijt>VrM6MRj{>*uA$K{iFAmkA%$I->wBqck+@F}e-} zjb%d|%Xi8BuW(G=ep8v^{;Gj0Y#8-z_(WH?#yU|+#lzHtD^S5lbyjVV%~vPTFb_9e zwl(mGmP+*8&n?S;*cd`j-8fs#z_1=OkN7B62MuEiQ6C_slV|Fr^ySot@3keb1eF1! zFB{3apXoioseV(J+B6xH@GiHAK?H1@|4i7lZshwgSDS)B^TPAR?G7UIo1AH%&3(k> z(%#)Xj8imk9bA2z(|K<=ap#2x?@f@l6sDgWp&+D8iv zz>oxfdM}wEK&LX(s~rhc??W` zYxoX}CMtUUg29q9K_*v+o=}Jf7Jk4<*pSZOA$v(x;PZR>uDj7tfIXUqK*Ur6xH$81k=l%JU zf&L@%3}U!2|A(${(x}cCkONu*i&j@ap0O9ij~^Y`ctAUdkJDKa4i)L#tQH8~pm(ej zA=fUet0JifQm=;@4MQ}c5g*KYB@_%<-;l^#rG!p1TstZJHg_w98Yb7V)-dI_JQAjN z**jL6M{gbcn&bNnSf7J=k1_VS?I&4p-00tPZp(B&u-9@H$wbJz8BZ*Pz#f3 z($tsn6T*szj?ECK12}^aB|QUOJ&=*50*%hdS_98?9%0^a8LzNN!4PR>+XG1%d z$ZuH0Kb14BSQ^utH+V%e^gB}ZNOn)D`b zH?wLwFq>p|o?}QU2G8)pM(iXhIIO1qYGse?is!{eyYGkB67qt=jMxJbV!dLC{EN3r zCW{Wif5_4~hnAQDUlP;<{he5NLIu%l;Pq5Nz+OGQLs>zU@#867jj^smOSC2M;zb zd;}IO)%YorzwJ-*$nI(PnW)szuh-{~-mJTN;Blb|zNy+<;u(o~O1C+k(B=m;AVRC( zIq$Ar<74M!KcnR6=aO(uqIu)!OD;;SDOq+=C+qk%BK6J#&=ymtMU<=_U#)UZ+_~v} zVO4S)up;(UUD%5KCoHl7Ogji_S?WjGINIkkxBMTPuClMmua9qxlnz0DIbp$TZfFW#1m{irXE(^wD#y=JIut^I%0*qfAa(O}wt$g(cd8n6SpfEZ+QfD99 zD;_pToPI?X=-cwl2!Y4;2|w5WGPg2HyQ?L^Wd+T}1D!ftA<->U1ly-Fq_9ZYc8{7F&JQb_$BcAxB<}m2nj8_lwbH+B>OVU}%8_ij zPp44dm?vPCFEmOMOsF{BZ%N|T;2YSYanmnbbJ7UVhgc$yF)lh$xQL<+Z z8DD)`ky;PaFWp(`3Q1OG=j1-m@cj};;7%FL3j$e@{E>bcMQ$1szMvAc&Xg15?Y%{2 z#F8oSM~)wm!c_NSJFcp}(mkT<^yNHS&-!ID*=XSV6}-+2$)u_FIa1O)G+qu2d&k)c z++Ee;Sk?*C(n_iYxAf)fB})DJU1NrxX1k{euIEc#(=qE5pOG%@*hUbS|s$s zWEi3W4sR=k)-Y>;^ri-(CPCR=&t9y*f+@?6EwZ8|ZgF@sl_zs{OmwT$eA9lKCFln@yZ*dzpL+;Sh;0ael{hX%mOn zae>DD4uyw*OwGbvf_qt9dq1B%70EZ!dR+vqY%cgGo%qYEeATL869vEWNVtEdnX8p` zQR$-@+Z^fe?(Xwt7T79nJv!cA^v?oy(z>@}Z?X`slj8_-3U9oz&aU0HtP^pw^I1zO!{?L$TT&e#mlk*bL9vZ1mP=^z 
zbaGEr*bGp=uBnyx{P0ur*YE?GsCPu%E0T;Tfw;U~5foLYiio49c?wIEPFE_sh~4%) zOn8oDhifB}38=xh|1lH-Cv+OOWLU;#MaBUR8a^s-#Qk;(5#0mtrxw)UD7l2wsdIYh zBPjRjr6K**7+j)B9;J#%dP0{x^5^B^k34m5!=gPeBaUz^+E-m=#M z-DfaijLGI-^B0E$2c7PRBa=-^)G)hNMs~bka|{;51rYOz3HCMt?Db9dug)ilOu)l? z>KN!*A6blF^XUef*LBtrLsehh-ltO4S7c%(3Xewo$GL>q{KY^qPKr6zgL)iH`F)kh z>({z#ISh#BbG{h@UZa8f(K;QM1e1#R9d0=4$bm5T;!$z`53VSVks@&E`wm5(l&&7~ z;nE+`dv81z)Ak~Bw9?^K0=(Ne{~>hhtLLP{-=mqc*irCY~-=gxx&rIAfTM)M@N@giaxUwdLn<8IwgUe>d|a{@PFgz7=IyTlj15V)OWg^E}=a zg9}@RpHH6bz@|_Z4{6@54yEo4t1@n9OW|DaHEtU1R+qfLuGYHtU_6K7-akS^-?;%+ zaQH19nqCe%H^KlYs#hW;=+iPC?wC9ZEy8-=Q3oW@SQ0YXFL*N0Jxc^YMFNsaa)}lI z!i^syZ0u};#e%w+0()LfYIa=;;46Nxv1Ld@*Wx#Q!x<)&2~w7#?4*oLEfJ1EUy?QT z#ARu8Dl15sosc!Lls9c^H89*pck|#lF|9iM7n&HJEgCu=<>&u3=t-QA@8QjCo1Oes z3)~6VqBW+$759y}%p_;G{$79E@szGH^TknJ8sEjH4mC__)v)^r`R`Z?vHnlO zoABT%%S1+Lg&B^ZWb`W4Mi=qPe||ZRLl|(}L-6jHJ8$`?tY*6P%n`_Wy)pT2XEb;E8j;21#9{xryZW&XY|`j<*F&kQEBx=(K=!^NlS0E2yYNB;Dg;Q@f@|&b`NN< zQJ)c#8YMn>etSzOMaq!(WZ|Hbu5XX6`~{Rb(6k#M1Q_Ou@e;+mpsB#;-H~z5LUAj9 z!9Bd_kT0b<3sq*%IshJa7Ru;djOcUU%Q=fy!EzH87oEO|_8{fkL+SDbt*jFMeHG#e zo%7qny8Hbd<6Pxf#B&Q@S$+WJ`@}0ux`ub~I7pb;C!zhQCfwUcJeH?rwcO{MJf%kT z72~+3rU|2|SbiH@;&L`qOJ(ZbTWkKc_rt^sb(MyW1Wq?NR<*|5G;L}x27G+4x<!Xox-OhAAF-J$O?_zZI-`B9bMOiErd&*g+e;+-JD8ohvi>W zbw4`WPkDW-mnjBJ*~$Z+*yU}5C{qk=fiVS|&q5nQ ze++}HqUb0hl(iRDW>9RC{JzYrQE|y@3L4@_zW|qj+c3uK7%-Z5fpu_Ol;0DybwwV< z;=hQHvm9LMp=7E`q>+Wy0*A40`LIry#7^1mHqnA~L?b~d#JpRuA%=3|hjHdB4@PzT zdci4m$_f84HeW%BcT5kAHlT``vue(T?8FDZ^Y7neac{o8R>vo;E&i^a_lwI^$#1Bz zy>>)?wp@w%@77$NpEf_>%ln_X3}}c3j(0S234Q{UKMROdHz}1PXxEM+7_O_Epyc`) z@i6ux|DB>-_(pCZ&bKjf%|qw$l&<^fg3OE6x-5KkX!_W`Gkd6Hv+Ob$oI%gvj#BBg z14zeW!YxWqk~klS>$#xiTIimo0?ccFeQ(Kr!A2{OpZ_HX3IN?L7PeZ@IQ+!MFq387 zWK(S1bP~obP=c2A)H8Jjb2R>+U*pS6F2!F{jq_(Nbl^|H_HCl#;~kz+iPvpJ^FdXy z)_wqkcR+m8ZUcZ^X{+?hD0XB_kqAvk=@>1BH9`!NE!W>U=vLo}!-{uq?^Pw#Ia@hB z<>8GgZZ*%P-+7Gt^Mo;2$JWQ-^X1yCpWL10$lETTqI~{O2N{N=&KagfP$E+wx8R1obCQrO)wD2Jbkd=n*^2XfF)xkc9V?(YPYyK0r&7u0}B0z)S 
z3ka`5=cOxocp~C+vD5Ro(bF*djs|QL=`Z)Ov{KbUWvEHhZra|K0f=bdT}+TqCRpC6!DgO`G;;%sjDv;T!8L5_#GVT~I;-ijcFd zD02rcl*{r|Fb!pw?m)bUxy=ou$C#%ZgbA%}X5oC7VeZec1v{gzm2crl%ir|`747WN26Ied zu)wL+&uy)0wV`!~SK|eIYup=k_6!dn6&^1eZdB+;CZTzgw-%I3;nxA2??AQMI^#L= z1cl4&gx|S-{|R(~+LAv}YS#adWv?iQf=)gn&-eZ8%aKWN`67CfXiYc|>p|7YbToi^ z1~u+;hV#xbOCZP=(bB$(5dH#iNQ9z_v`M5MwMlJBhytoKIcj9@0 zOzz681HC&88Pon%H7Z2%-{dmWVbC zM|4bDjmv$WysBLcoD%q)F+oU-NWG6M8X!e6?(0$Dc&D6aiy7}sEaKbu%gaLBd#h_o z-CO#V`k-PCgIEMH*!E+YsL{{_65m|k3=lt54j~*9E?>$*BC=7Gi-TRe%f1hTek^vW zKYD%*@F}?fOn798@^z7!=y!xa-WrTupeW0a4HWeq;4wqPP;_%*sssAx5?A^Hi>bDT zpVRWO z-8N%5>uMrOSP%0njTnA16x2qIasjXJBxkgC95fVvWa4m#z!zm5)}m zS`UcpAG5@oIn>LsZ**7(xFD@3#oE8@=h&xxJ&}^*YBMTS?D!Kq;gv`36H)Hd6HYsM zZ&Kb9vTa^91w*ny=S4#G&xG0EX=)Jogu14wAvggXt<4291&XO}+ zmR#QYidb4VH%D{R9IMldB6!^dXI$`*Y4eA>_TQWoSTdM4=i%kc1*g7C?@yWbaZazR zjgecM8>#;#0R)6Fbhv!Go=f|BE2NHZOr4&|utT|sIftT?UWrlq87icvtPJU+p&VU$ zU*c?ROY`sT$xdrN$Wi(1zDr|oDc;J}anC%CpO*K6kzTbb-r(Ir%G&7!8Q@z41l5|qI zs8KN&$$3EvOb=)| z`#?+uLrGqTw!}Fe-7W49fdQUNFdW|XdX&KnZxL+;9c9X)u9#4gHJ~=>)wX;aoHWz> zIrzDIti#lMQu9qFvVejPn%`SmE^!1_XceCElzzo9bS*AC;`S`?zSH`Ou9`w(gOnAOf3C$-Mx4A?&^Pxx0@wjYLjQwpD{SV!r z2%t@s)l;lA$yhX~xZUQX3>H=A*$67UBgosfoznMp6vshH*q%J19 zXIZ5XLoiP%{;h>$;G9NB?mXt+03ME*bq$cn=;HDyyRuAOa9Zo@VX(D{0y?n@@9@@J z9A@6WuPIzj{rw-qsllwe5gftUm&sp+mk3rK#**wqY`5FK$Uy#xX|j!uJD<>mw;mG( z9vYynTd(@FJyp9N&l8K#4dQsPMcPS}$Uiug{8gZ)T0YQVCmg@aA^fED{&&S9=MN?I zb}V(I{M;&1g5?PbKleijJS1TKC90U>!GGkZwEt#F+AF)U?{+V4>RhUi$M54fgL#&i zN*{L%VV*vFGdu*ywyR14ixq%L5Wn=8aXJ4j0G=vPtua@vMrg_;LLjDB3`J^q%Agw& znA+DGZoo%{rgfdie@YLF6=d2tRFpjuEVm3^3^j?*2!kB&s&jZ8dE5JP|n#nlk>Z&JwAB8PDPK2uwsU>OCCo03K0h zJY>ie6gO^A_aG@zP_LBI?rl*N;3@S`-gk*x$9QuNkn}VEbK}!j5=38NWdOtC0wLq? zo4#9Y^EpbcdJ?wg}F108))1fi{NLdzZ4G?NP+&fPQn{uf*b8w%(UXLukyh#XO8U#s0>;>*`8d=)bvn0m~d~{X<+(uVcY9&j-fyD!#yi! 
zX%=!FSbsL`gO}Q(71B0MuthS!VvvpxFz9QzZ~E8t*nzf|c@2`l6Fi(_@5msT)_Ews}f z3f$ioyb0y^xqe~Q@?yhW>RKyA*3}hBa9!lmXR~^vSR@&CgLve9*(83Dli@#89&-M; zD*E2FkAjH@!ISqiyy6?VXZ}Uo%PXA*ec*}Tr6Y@uwEUl#@TG^@yv1sxSVqijswcGg z^xHK{;`*h)VQRTu%W}8#x7FQZo|SZMObvdl)@ZI38vWIaA=C@7{HCwZ?=QAwowERTpg^&uM6*}X?av7+vO87O zutEHdE7!CD(wfz^OSGvf{NUMdJ znNkV29qiw4uZZ5vOg~G-SC`a3?!_%zTg6s>=rd8f+P!DKIxd0lT_3{tyW()DLSz^P z;Fh;$$&RMR+V^L?$){b@qnrYlOT()IL!%G>w(r6-`gG{5_kC-8NKUukZOFH@b+`N4 zm#u!A-T3djjY5$h-7t+=nTh1>N80gF$rumGll5z5KB$lVWVigbqP3qpsImdb(R|=| z7{0D*)gYWCO$`%a*VNl!8DU?eJP&si$)&{Ul{-p{N&P1aNpA(&J$nTbin#c88bR1s zRb*4UT5UuL_%zX01jHSDON8XoHBXd5P-FtxvT_#UN!~R=3l^em{6&R4zB6sjjO6`j z^?$LkgjR_hxAwb=psAP+&MnlZT=*Z~Xxt;bg03g;xrojKj!^7a8$OAp* zkXcrc+Vpaifp+M>xd{p2bu#rSKNro0x%G+%;eL$94S6zWO-`*AisL|m9}O`}Z+@cb z)d(_6(%o|SIU+{*j;@gysQO;El6NY7(EeSv6EwgW`NYcE$gw=VJu~R> zR-)5$auV-L$%_boO3lALvFH(Jt^JwGPt%_m{dWaEb@*7@6UE?5Y`ED-YG1U!Vl#2p z-067GQxuu7PaXe`#vysuEKHYK;z&t@;gUfyi@60`{^YCQfzlN#ng-aQaFoFbl4nu( zHt3N62PHYLHy_SN?kEMQ3MxAvB=a%JfW^1Gx+^z0%Z-B%6HV_##{M;lUXj*4RPb$u zUKxM)iX%T)rC>1HD4YbC4o(?27#H{8%1T11f58E zbU39HEcz^WD&yp`?A}>9S=8ntW7e6cNG#$TqR_a>swd8!1fC@`+I$Af>)G;i3+WWIwBzZU8yy5L2 zslqVr_@i;2a?Af~OTi@o4|6s1(bD5o0jIZq(Q0~E1(5AptgB=>!w$;CB@5t!beO60 zzNMM)E)cU)r{!hoZpu~9`u9=)^RFol!sBdO>czj~??Qk}?!Lq-Zb;|f*830|b>z>0 zPlXTn`_?7^NXJ*tz`Qm}QrkvE&DX`hfr7QO8Kel3I*1tpTf^GdNqS*bZ7-A5EzdQx zDf#u~pG)qm;TP#J@;z&XgR4y*@BL@kJug{!lLKwcMTD?+e&GzA2{n4n)bWA9jGPUaw+%Ci&3j8-_frKQY4iX5{*#T}Smp8sg*n3) zb(dDVH=ejqj{u$g>1cK|9#~#z@EwW7xslQKf#GBy?RA`rfaI|pB z(=s^L+5P{ybYS3h(_a_*t^HcXS!r?hNf(`2+D4&3BB{c4QXk$0$}U6+H!*NwE9n#5 zo}C`p>11H8OM00QR`TIGTkJ*-W}pOxaG!FIA1k-F++%5w6Am6 z8>s`ydc|SpOb&uC3Q;DVU&OK%3tBS3U=lH6;-WZjRY?m@yESv zVnaEzq6`lSWvAcw*La>H0No0hFXLwQ0mRe=cDFY0V7C)jf#eKgS~N2)6pZmM6_OeK%ig?IRZ^+z)C1 zn3sX>AHohRM+vpso5wzCb4EAGGQl_lDups<_A?==DRr<)&%;Y2JLX3uDhWS9v+rg% z#nXoBv=xpW9L#y4`qa8QY}eh}-79C;{p3YmSuN0h_s{2oK(U`Gm~`b$zSFYcSWO=F z&84_ob%UYd&PuU~td?A?0fOYTId9Hnk%PNJO?cRW8?qFy(*ARf?vW$J2cjsJ&l 
zGc=%ZcYu^#JbiZM#MQ!qnGp#ryeomrQQ~sTd9wI$Vyg+ds*Bez}D8^6uEEu7k_WS^{JD)pXMQ zM_x2@|3yLg1yut+u646cxm&X;Bl=T=|Lii=3ukzb=>)js&@%_E7aztF*I-2w`(9Ca zn((D~sBFjltBl)iWNq`T7*6^cA|@)bw#Q}3AiQH>4{PepT=1H_0e3UKJ~e*3piB_> z13)GyA*wnM6hY|KYucU05yPfeYVnvo*M7L?oZ-IY&feIL#opoms99#R%)!%E>&(k4 zNrZJK=M%PO*N zJwAfOyKsV#b^dWaxR4Gi5%46Vhzf2yRfznsfQRn3g#7SMQFI~+KxO4Ie(}1aN;L8^ z5!!ho*-?hH%G_etYk!Fll!3ukQ6YlvQTP>yNkc1NGH+xnjAqT_;Ui6~C#}~?%L^FQ zOL$~87CgsJcERLqRI;f&ajnIDpIwO(Yespc*QzgPm*Gi}EajQqCdO<`M)6wSS$n6t zPAN;m6db0Lv6sS?;C$e>&fIDK$pywykzThj!!5q&7a!|m5_moS&=R?B#l%s)dCp-dBE$7NyTq^)Zs|Jh z(9XJpNxqNUwlsWOZ{$aV?_%1&t4dtBc@^Xb4510#z*Y_=>d2@0Cq%WWf1Yc&zf6j4 zga2uDzhz14BgP}UZBzr0{*U3_iedHGpVY-@2>7uPSX}9?e$7S6B>Or=Nl8Wl2lirl z?xj-5%L~!|8^WqXB)Wq?RvEnEQ2FEUg`}^yeS2!=gdo0SM6GBkbGwYW5;U2)QLo}E zX*78!9hl|c@+HKFlE)u+-gIbQGoD)K?ae3upP%t8c0#T|BE|r}V2W|VIg|zPzUuDe zvikSQEq?FrT7_Pw#fVQ#gIeF-pu(ppY(R&|n?Yv~3wbP)9@Tg0j?rj3SI`rTdhDCh zx_~(0(&{Xa(B=hx6vWn#EP650$cU$Xa3AJa0#>Q=yaoQ zY|^C--`>WOC!+@!zj&9X-@1E@T)nZp$9aY0B)Z0#Yshn?K{H z;X8wrA7KqZ=T~VgZd!fknt!sVjXzu03^as7I*>? 
zCVgXeEh!^D;r@7S$Zf$V{FC?%me7G!?9S>8m)j}C;6!LGK()*#p{6}?t}uqrg%a(; z=jc|l-LcCZd77nvMkRqhQ(>!TLNw@Vqn9wlZ9JB z@Y zFToPd+={%}EVbxXV6Z_byaEzcD-R`M!^R0Jb|M4n|D8|W zq5S&|Mri&X#ffY?i6@NQJ6cy;=rhEhN;t)XG`4lf0m(EH;b}T=04Lmf;9(Fp=Ev<$ zt?>HUJ7&05EAO!6l#$-J)7EEj`fRt95J+0-9P;pvKC%h=7hfb4XEimP_KA*&$tCWP z&GM2f(2Cd_&p{!D_@p-?3DSTRncRP}Q1&3iZG3j?GX+!)9Jg=Z*O1ekll>;2<>2qf*rjsY;5syATF$W@nI zh+i1|p3!5E4DwaUUl9j}c|5}?_%OP(FFSCW_~kaEQ4&?;hGpF(x6OsTW9CObHv2RB zA0q3I&^?EF95Zb_<>Lg<_)4C61&N0Ge&`pJu(GB2I+ONRndFy#+=#ab3px4D`hy-1|6GB zomWsd+T@q}*E0C(UzehF*P1VXpD0~{qI2jS2{(qrNz^3+RuY=eQ^*mC_Yw~@ZKDI9 zGUzlw#F15G$*&;*wGS1>&L8Yja-(^{5fdWtw%kPYcvkBDDD%zj)|J*UrbDArcs*k5pC=ia=H)pz;DfLQ8LsDV zByY%MCo2^y3t%P!%nB}h|M~i7uP#sbZl#ak$xe~!5|5ZW5-y;zNar>@+J&1_Erumw1HKb5h(Us8Fw8hrb0<(7{D#rMGa+k4DlU$jmWdYzu1G*G@3tc2qFQi9ym zv(AM)2rVSU_;c&rqY?txKtT=^!@GzrR0?)540MW?!2<&nI2n>Una~A$C)9}04tQw$ z#xR64gWi;SG(g0Y+oM05OmYj(iM67Ghoy)Fyq^Lnw*Qb*xJydwUlgTN7Q7b6?_mVw znwT#}BN;=j7J34}2gG>ilP;5ceHbDq^SCAVIVSAK&yV7-7J2NCns3)7QP9i7mL)+z zlO@)b4SR1^tNY5AR96C#5jaM8OibMQsbR!r~`Z~xB5DNZ26l&D?z$qdO(BB;RRjy7&F5hk;|1Rwb;d*be1 zdx}wY<%YYHyttGboBvDrLo@xF$u3DhLf>NFxFSe$(r;Dy$ReLLeH<@5_U@M4Gt}I7 z^?iOjp+ML4rFAN&gsqvvvRfNsChziJ!YX%JMta-M4YPEEGkq%%J5r#+wVeLc5gPbI zaw*s!aQr#w7F_zk5>B~o3JIYYmZPltcN|D`yR-BL{u}`;7!PSA^e_*zjx64U63f0iD>(2sg>VQ+Lc+l?{WK@$PF-ZSX)7&ja*leGvL?dO}*BBuFl5LK~+ae?+?2I@_U~%d}gZZ6w{nr&vc;}!j+sGWD&Km!B%a#3~@RuU)&qrrbQ>=4)!TUGE*Cx4os_o^pR|72~H&cMRdOI83=$6Hz+VwmRPOH|Y9 zmFx~IUDh*ee%iwyZz_K*yh>ekVs9pyM+`BP!lfdc8`+NAbT%Ddbgza`u728j$Y9_> z-VJhTtQ>;qGFh3WtxWsco+U^#p20r}Ed^kkaqsBz?_cJJ%lxAb?k-p-vc}=O^RUV; zA_SVt^y=weT+jmg)^FB`3O=rYU6AQMzb_ZwV6A zj}gDVwgk!v*&m_0Q7v}Fi$bR{=kfu8pw!0Xh-0aQ$IU++%Nz`)mJ7t}du=*@64H04c4;8C7tY3}p)0`+!!89e6Q$46a%1Hd-@HD) zl0RP)nmS-9CR}PTgdO#E8D!r1+%G{hT^!TiwO0E#`d~?$ke1%(a#9CtXu&Osm`@?U zqo+H3;mq05(BDl5(aQ%;9-oYU1m<98{E7kWC*$2%_pZ9ipyji?ROe+y2S^)A=kgJ~ zQa3C%PLf8IVQv#uTrONR^#cL-_SI`!Lfys8Z9^#M9YjKOTcv+;wKpB8v=go~ZxuAvbO~kPF8T`h!_D0v%1iZk5~YKc z3;ZvxjTMQMjGUiTly<01FKI*{c~p&^B-eD-;<1T_F;&$vlqzoZ`H 
zn)PQ56BpCqb1tjc*@Y07s25&VcW7|dCD5e)TiOjo~ z8<_nGC@`bYqW|Y_ZRS=BWLUxbfqqM!ye}mA=-}Y83v;&rE~r-2OSG1)vW4?hJnfHu zp8KNtQ#Cr?^s^^jgkG;pMV_tOa2_mJDvd~z<~ER%->0Sp-_RXB*$meN7sA45WCOnv#^=5=)$oC`=l ze*)O8`)90whX4n8R=>#5Fs39H_q*A(uRLxr>Wf(Cben+N=wS?4c_cFZK8;#(v^pMP z(XcJ#qM?c&&K!Gyxdr?TYP~RA1#1<{YoS&fIyKdyNCM!%P2N0DcgFM?p_N0LB26^4 zmIauqy;qg(A1Bcm|Bo&AYTTzWvbDq}*^0h`1-@1ZtE>Fy-`_@t=QP)^(smKYxHn0z z3Bjp$JRga~%{(HoQ*dp4X-m(AE3=?UO?FZv4uhfmpGu_9n9ykzONH7o``$l`_E|$O zj3xms^^#Tkn2h;3tv88!-Iw{;KH9^%%>rnT=C|B1zJzA=7991ci*T32>)LpIKg645|=m;ey_M_4ynk?6WFxOBDIv3|Ck9RHIu z3b;AKl-!=IBqcWaDI*XjQE?w~+jBE}QhXG*H>)0^lQXkQ_hz?sbE?wOt-PhE>Sh@hXSXhj?*E4Via4H;CJoPQFg~ zrn%M1k8txQ;1ABu=>}i`C`)=i@02tE{LM%)H&T0$1-0{DS-op81C*o{)*V{^og(-k z3a!+*F6)(wJeaRYncZlAc;mjX^GL8I63@}=i5lTT%IhnuErVMRC1@DUEWe-qoMcnQE~KmG6<7-4FxJCwi6q%bVwdVf zEs2gQrSHGBZ9l#2?r2k*UG7_)Y1q}~;XC%HCD2sFo;4NPNZHllaXWNsx}wUC8?8_i zF)<<*cml~Oog8r#DW=y{k?}$ROaYYP==phUL0*}f_d5P#Q~qCy{PdN6=*}N;#CdV} z%_J-KDLwbmjK~wayM{+4UF%fpS{~bP{&S(@MB!EtV~mtXTA5lbG!wpHuzEDaSA%7D zmakQwd({z%uIEpf-4x3I`{OpLJRn01LO1B54=#juWsQk4AfqC78{G|XQ#BYfK7kDL z=qrfgI!dUu8~#mQPrC^(qln#2jx|jCR>k|LZzizwYP-2M>}BM{f0G` zz+ud30iWd+r; zBxFb`yo)ajrDlaezNlVA>0R{mu#mQm;H1(JYNGSUqZXS%ZxK(8uCJv5BW<`o!Q6Wl zW4;YP)lX%uO=n?O5hLY#zybM1?7%N34OvpHBO>`IVRQq2kYcpokbo)l_}vmQSH>Q_SHDC>Pp)Sqks1hbJa zqU5w+cx-Re-zb$V|5Ti*C%*WA1C!eeDJUcOt2td}?EG|8ma$6}ieqAMENcBReNTu< z6-xi_h7jC9oy6%_O|tSuDZ1R;yFB2iJ+X?ad4iN?auedi_qyz5GDFAIKPLAc zd|qi2E7bN*>@F9TyT>7ALoxa~j~q|p6;RDLc}_=VXzK<>{LJq9<;5(I`ySxP|1!U~ ztT3&1)Z;^C&0D9dLP^>M=HTWu!vhYR2`D-mNvC1=h(6V|A|~ymPsDpSf`Gj?bl{`s zHk*~-`@gR|s&;p$z*`URa8CV12pu{v>Onr>%!J?#MKG*Gu0a0i+)2%+DI!~s$A?RhjbtxN>BqI@XQzLaatL_63C>5@*}0oCh?*4y z`6AZ|x+xdGBz9sD%5HsXXC#CYZq?IiH+bYx z4;Jq`;N!fvYYq2@iLtc>5_BL%yhr>(=mSJWgqwmDGO0KARDUq6hNAvk9%#owkZ9FK ztoUJm{0g)t*CN02j()x4uH?iR>Xw->7NrX_4@+52Ckn?u8xx^U5?U2*6@UN?c=YgU zYOQ~ur-FwC%KY#znCO!)QP5a-OOgJ8QFdRon%RBTy42L?OPBYDP)+2p!}!r(8PNuQ z@+-z&bZsonqf?jA7}8>Y{+5u?$Cfur9}k{|&&2X7n4)^&%{Ok#_IZK)$rU}wo8VF9 
zbRTdt;d<9<0PjJw%cvnTXOvdTNew|*o<(2r>**KH-^5^l*adHWXE@tdo+HJFaO1AH z+erU+B?1Iz;*4>T(X=Dz*P72k_U<4iv|MT;qW}706`+i+#M$AeR=M2AA0&XNB~caQ zdXo=dXHi_zkJ$E>J8`SOC4ZH*_7SNcOn2HS-c~$Jp`6}vqVtU+V~P`g<-Y{O1LuFz zEv++IKOJ=fkPdBll4Cj(^d?U?&AP;CvrQ9fLm;P0C_yM(&C9$6F0=}E z4mhY?I5g4wT5I)lG97hqfwljgWMcNSaa*aw!+%EzTTf^D8D9tqt~Ir=l3TC1^ERPr zwPd|}qki{@>d_jD`wu!MX^3I}U5s(t{u2(<9ZvUji0?eTzGx6_p7K2=ekE)xkT!qj z2B?e{lk;Nsn4c)JSO_(Ukp@uf+8#dcf(($1zie8!^ zMUSo`Kx#7|5bNH6mf+PUAfm8Ta?&3qdh(Cp0gHiY{-Y!}Oe6cLL}UyrVZd4<9o=}j zVbEUgPUCwZGEn3#HrB?w?g4-s**fO;G9%uNF#2jTY*!GSGTX( zoK`7WD()!dQ6z8PdT33){@vog=!da!ONS4=3}o-&Kgh(%2l>>^D+Xv7U}*zpV?E%{ z?mn3YJX5IL+|o7XL_5i0cqxg%n)}ukQ;Lw?XJ?=r8>vs}2T-Hc6j6NY*E@P@m-HI& z^Vz8qom`F}=q`MJ-xXWDp~%zKY+$62aop=~oQ|f-DBT#qoW;F%t-AD#zTJ1RwWkpa z8pNDgO~?-k;RyO3MTree;U28sXdbn5)gQyZBaZ9XCF1B?A6Xi&?3&V8qMEMv4p`EH zHCQkuZevu_N}Ie3hBcCw84Y|HQDsL_8w_&oIvy8&g$m5_)7)~{#z+2@kG3p_v(nt% zJSc@e=MMF!10US#HV<$%brLtaUgoFKkMD3U`2&n^9(LN`?lJcvh6j?!CJXnI$0PS` zJ2|(Mz7uDBv#?n;--L~nqm3BUl~Y<_^}rtQP;pNt!Hfg6o5RpGw;;6sUX}iK4Z*csXa#_>Z`^=6(1T4}nm;?WX^`XJ6_=lrz?Mk_+_q ztn51W$D4`1o}zJhUEe3vL7#ncYq+et`zSIwy~+K!a7f_P*0K7HnS0=M155lqdsa)4 zm=ze-kLOCcX`N(1O^^C}#O(gaYU}CAM166y!hCT##xMC5>3l9CTu%s0is$haE*h1q z)@e)CSM4MjRBxlGs5i+n`}B1dgMdZ)2~H!eko z`e%d9x+>A%!UA3zJuGs$VgK+=T=vWg?(w$na7(Ws4SAz<+Cf*eh+HC9*xQHkgR;)Ixo5xI%5~Z z{r@b0xgJT(FQaKN4uc=?n~mM4uHT1g)1Yp_D2&BZHEU@aJEOKau2jjsVM4e$g3WA= z)BVgNAhojhONkBpt8V;i(jWdEA?+E#hE6Rp&j;4G`OMqzScXUz<;JD zOx?&!*;q^Cc4DE4v-*>uNxpz5l1jGa$ZaD2dlBI^Q0y(Y#VJ!f%Obq{mmpG@sdF4+ zpm6I`bKM^5JcGlJnn(Eq~^Em-`k3^c4vE9Oi2Mc#<2X zBHGC=ggv_px%As0i+v#J)tH&dqg_6I1Miq-*J^f{3+B4rAg(WAb^mm!(_KM1?!H*f z#{pdYZ_y`*D_(QBAV*`B}%*Si+?8D9q(1wSEY$@$N9 z-_I(~NxwlIrd|I;#zs4Rwge(coD3;tzD}PB!A}hlN#S}6eli=}zH*73A}CmW23O=K zkZ~ulx(;FV-;bPe$2xCuXwzEv7Q~TQ_(}K=psK38n~sP<`XdVv!EDE&(S8VzuMumU zMEr9bZjeMiM9UBXx#KnWIo!B7_sT~KwIg@ALK?Qu$tVdAxa!-a`lgAY znGygIJ1-qCIe^lxL=Jr(+{^4vKw8zFQMMj4X?0BRTy~0^*+q*j{xna;H*Y&z%V z{1;j1zhlvtnV1%Roo@G#E&3F8u?s+4>rm2rmwLRnW4&8%#|1|-;nPlaAqUs&3wD?< 
z!qWZ%&cY!q0j~88m}1~4i@Vk-v780rTM$`a;YIH_?C_UyWFr>%UMw4yqH7>TRjcp_ zWZnxq{sR8}_n*3C-u+Gkn?48?$Uq~EG#=Dva=APmB49pp{vG&T5Uc-n;I(>@d@w|? zav+n8BcV3;B02(s@k^IhxGzx*JcxC`O}GJn0gp>}#3ku;ACUhWTh`$9FG>9X%m(2| zY=!`k)2QbV?|coX<6DTq^PFuU9}nsyg{tBe=>`li@8maNNQA)b)PYb2?CEbA5J(At zGJy8_iSn}~dUCMV4;bsz2AfSh)DmUWzD z!2=+9$+CaDfPx=8PRX)+-~wnju2#+gd9Ma}%ZuQ9AmL{?hF0ceaoNvecTb->*{whU zZo%v}w*wjYPp3P~T<$14{x0e<0Bpq#kdDAG<&U5*a0Siy+Yls&P>_2I6SfZ_KBV&? z6S=r`hzmC<_+3L}A?3$UOa}tk@J{A(BLF9h02vokc$x)~v|(;WDf4~$Y%6VSXIC$z z9xDJV)7A$8zYDCj8~+&u{26sP2CVL2jd1`j%JJKG+^<~!6B_*z!z>YT1o4qXP@Sb$ zvS-~22#k5gz4_!Q;{%@tot$6s$B;1G?VL1+Oo<#xeEBw_mFq6 z4~-1Shz1_fC8PLRIFHkz>wlhxYasoAwPqqGtA|;*&`tW-VCg{I!b#3Skn5+w<^aA9 zZUHNK&XXv_{2L?VTHqD{$Z} z2OYr?IC2gB09+dn)nZ=A&xUjZWHu;n+$_Wb(`N)n4F_{JVtdkPPbUOY0w7|)YdR@Z z55;BjZ^&QS@Oy_oV);wfuXi;F+9OT}IX(ls_}@UdZ^Pcz!k=~Dn@}CZ?!S*&-dX&@ ze|zGBnd+SiT*HF;-4qBAx$J#r*&-a$AiIWzy;3Y@c9u!X-zD>&g%-`A}yaN{aAM8ZM0)H9AryZ);)|C(@DrW@t_FbGB zodUUia_pS@nq!PR?)BPkS4@7J6a1)J6Brj$*o7{k4KK<05`Fd}#M;sXPx(`w6a*`Y zePkjj8NFa7{eT}hV)-Jv0V0k%|2X9i<6A+*+MCz)^^UQUV|g ziu5Kg1k9mN&aS@}EJhaBiIg*so%KH0t^f6f?d}!qzQy96j3#}Op&#~jCyHx7p1x%4 z{v+6|i={17yh$ve^N_7?1sT7Mf|;DqY+8a}0k{$Q6#SB^mrNHI($-ftb3E~4WZrNZ zw8IpOTdZd3AczZ~wM>6PG!hr=M~~p-&2G0Bg5hI!zApm1rLfE;W{4X+QFNaCeqcxa z1xkK@Ne2Rg!}ERu^lz1r_rAFx#4RAij}b}eaHG`_AxYWk7plM5^0FX!0YXE_SmWd~ z2$lWl1ITRe5dC=p<;mYr9Q^yPP3{$lqvhrls&cV_h@On1;xxAtF_5@Y>u_ZB|NM5s z{R%O54}#6SK303qD1Z44ka$kI0mcnz3%CJ=AlF*&H4jP#A4whO?@K0X-3aklh1P{d z(UxhfXv{n{L3U`8@5_^)-}UR=xDb997ea-g%D$2XZ4Su)T^9BKYu|2TwI{Kg4Od^L zYq#$|z;vzCeXX4%2;(Zqb&wsecD6Gqe;qr!H$f;LJhQev%#g+aSC5RCie53&&rXcUYx|toIWk`5?ZR7Vrh%re;UN_DJE4uF*i z4Jr8fAOdX?g-r*+uad1iMAAnP8qZ+bI1kuGB{Y0~NnMRC565QpqWSKBr(O~Zd=D}7 z8D^HpZ*;*u>BFt?SxhN!ngz*6=q7C8L_nDuCPA#Id>!QU@|HCm^7qK~KuD>qLB@G> zGGOeScmkV3?-e8TO@m-S+yKZm;|9#dH#SOwawi;)r(`V@JZ^8HCRM#DDFF}>#81WG z5|WpEUpsy&`iTR;jsZk34Of=QA<+C z4vS#h2lu5@PK)T9Gw5$T<}|3xLC`gTj^^@1=B~H_gUsXE5CO6lNVx$u0DkinkrDtA z`zjt9iH$Yk1O!_bdSZTqS3vzpwU9&_BuWiAe9}z`H&M!zdR+x+>!tAy-_uPh+ 
zraY=s|NeRO9LN4%aXmN^I)t^ygvK2pM2%j)`PlQUTd15@-(IDc?dD;!hkm zU8R3|LCWIJ)HH~I_qqY{A228224I@mia~HMxm+odLdl4kXs!I)nxmkVdo1a;Nt<80 ze(m_ZCNaC=C&AzQS@Yye>AT#txz9caz0WMS6WIN4S~?fs@T+Y1U8$no5WC_Ykm5-c zvc6Du z0alCQP`t5Y24;H;-AkvJ<9+d*I|h4RULJK?;c59)fJ^co<6ukR*P#+ohPV*&2>B+2 ztjtPJU>WcTJpqk&&r`A6l0QljhHoZ^vF}P`B1?e>a0A5E6E|R%=LR$$(N>}*$$a)8 zOeVLc%_SuOnzri_sJJxzwd=pfuKxkbejvrdvi?eDOm5oTXDx>)ld)hP4ZDVJ+M4IDrod1d6 zexFyvULgVineD#UqW!)YyL26l^CS8JvWq{#X@cEY1`I$5WbE`24YLa!CfyLb-)rax ze2aYiF&uz9DEK|#n?wlO^$XnyHB94q21si)z7F03k-firliP;5+Z>c3WsT9uL{hll zGF+sLTVHt~Tu5I(Pv8H1>Z1D+j?hVb2He2k!~ggD2|LK>bu1X(36vS-Q(+QqEhosu zvCcWzLYWt^G!ZvO))g6aK7Jk)E+b#ueD4b9VB);dz*0b5fjRSVCydFUKWaE;c1236 z%cscIj~uZs&8TUiDFF}>+@Ah!iLoR3ptOV~AJooY_i`~>#~}DlU}2Otxh>FWXHUwk z+OcXS7qv3$>p<@P6UgmTcs@?3_(*Is18@1hC5S1ey-#FFz1nKcVS z%LR%!C9|ea2RSe0JcN0$_gA`a(18CALf}HzT|3IwdlVd{|NB8f;*RuVF`zLmIlthO z3Fu*PGmALOBx>Nyfe&|YcF&>=Da2VXWyqj%6u`02S3rj3v|@hfce1VEbR_Lq7mGX&C5NJ+Hs<0nrG%!hH}E z|9EbTdl~-+%jeI{csm41v;wHqX6bj#f*9NZF({%*UL8L>cnW_97cr&mx8sJTGx3{_ z9fknf!}oF&9cLgWSuuZ>o6EsfJr97m4YqosD#{iW;&lUt#0^-G;Re{@LgPn7OTF;} zxSn0mAOKAXfEXRo)Z<0UKr#us&yHWzzt`}O(f(>i6)SrZb)1g}vw1Sf(rzSTkP_Qn zfMC2{PH1?5&#il*C_KfVs)QHQ^lnK zB!@otf?dDK`W2I>1B z(-~vaplRhQ*&v?N4<)k6bdnksOJxvs-B3LHlX6s`AQhvIU9f8p zaBAN%Wc~eq*DuxeP*8Sq0CX0X?oGr5zYza@h=8}ZuQzS>j3b!x6um;l)csaskZyT5 zyoe*8!)MKt-*v1YbY{IjLqL@jcqa3wLzZMAAWn(Q2EBkaF%rN+-3K#KhMDOzvga3@C*SL5kd+=;1ZMQ6v&Av z%^tjV)2%0-^a7#~0G0_qSv)HIZ%CL7!@)lm8u$9{6H(pzls!>V+M1-W~QqznXO?ET;;ke$r+V!L9 zugUEj;I9$eUXy65lgtnEEsg~$pr~3{wW}3^({4?RwKY!*ZBEepVZQdT@E^kb>mmqF z`T}y@`ZCUORtL^;;^-da6j8TF5jXeXa2&mI-5mMsKY7WWg99+AT|Yp20*=R=v4HDd z-Yr9ue<|*Pd*K#b#VX()>|mMyJ?5Kq1WYjycgZ^8?h)&IL?3 zk2>FbddD}VV0B7HD44S4KBNRdt97Y%{c;1UUB8_A=!CF3YxqY3@nf%n9DjD)yF zS5tA*B3mHnixzUx8eKR>`G(!!32fqDfmQy|o-OWq^Jq8QJlYk=qM=^Pq^uWLy_Ycg z`|huE?h~Bc$o#JlO>ZNHvcRF*v>B!@2P{oR8FGP{KkmW*!A)HMehIhWH+HsH;imL4 zhD0=EV3k3NckmDJEzWXIa9ZN4r#lQ*&jBlGxK9&uXo4U;5ZU~nzIxMre&me%no}QT zA<+jx;LRk}=0Zr$_*05tkvzJw1SNkII|ES?+xRR*0hG+Dx)Cw}kRk}{hb)9xKXlaa 
z9Jm29;09nmTL?bYtyr3SH6V}@01=_8d7(+{=SRe3HH=2hWZu)OU4M5*=C56U(*!}5 zzp??$f@KS9v9oo$Mx`_DP>Sm{fO&H7#?; zYap;+PvHQFo$T*8JCXyS5xzZ=Am0a`<-T_|rjF}yZTtp~d_H`3tJ_W+C}#drUrps950AskjEL{sUMY0g1i^8~nrFTLM^RbSZq1jdT^P z9srk)Tix!!$qOdu|MS{y_W(|i+Vp4)ue?;vg}C)6sFV??OWA45D6X<3)K$n|Ck$#+ zF@Iskov;ynjmZs&)GK~!M2cJ)1RFhdN&rOcAwOm!Mz3u&?eH~0#qQUxUsSs(8`ZWN z6_;jI(Py~9&V3MK#&)tssb+L3bfofh6f0NaSK&1@`aginArnEJf;UCJ56Dzpf%VwL z>m)t7%>9(z@fWzRJ&P{|IZpDN1cB7t3zO#=#0%F@y8J^TlpUB=;vi@f?ty3Fy19|` z10?5{3uqBZpZ+qhQu3*<6_U6~n zWXo8{8`9#Hc=P2`_FNB&<7ysSKw|e zUogi_e?V*$!OOms%o?b zl^-i50Aj$dp~j6&`35+$SM2_KQ0KXJ{eFirW}c6Epyj?ryti9^EJ`Sze;khDh6qBGkQe9qwr7 zE%y?RVWb}*nfFpoDVz=htz|F;KV;Wu5m6j~{Vu=#r|?*JcLcxvYHp}9Li|%{&OF@L z?&N#`DLB4?wZkfqvQDTgwL6!{V*%WOecLyRbS&o-@@NRpNrjE5`M&)vn)b+s4g}H{X2Wej1mI8=~+*~XLq$Kzh95oz*L`&VK z1VFQOrzWfv`(&=8U4K7BfNA&_n7m_Nk9i2SJoMgfL(idD+1rwFLg`DzwFQWreG+^2 z57`nU+>pi=S6##DgZuC+pIi1__Qro0lF-m>kaGt?eAha{=xTTOc0H;LclGb&(=b6Z})U(Enal-<0A6(s~#w$Z!1U}yh@xviGg-?hcIAkJH3|4|y zHstjAm@$)$#Ep+9q|1H5c;Php?t`>SuPpVK4Z*wcbM{_0rM_pPJV`Pp^Wb&x&Mvv% z!{46dGX5@zr_BcrjofY4A-qePZh#yJO`XPIY%*@Z_(nceZ>l>hpDIw+ps_;qI=rVx zyZ%7Kzw8C62K9|wge{INGP&zhseBaQ{**Ru;^LXhM*beV$HBYz-OYo4*{N=CU=RQ1 zuFY=C%Ej43af;^>@2grch=SiOSnr3Snsh6*xJPm_R%DH4qeXfh(7}UxQfncN#>i^A$h=Sq*xLA?~w7 z9E4MF`;SxZIE29KxLkhc*)4W}&}5d@hzYCEJigF`> zN!kz@Xz`-bkR>w9$;gjQe=R4E21%oNX1W2g6o4Brz`7){fDRYp03@SKYh2aHPz`9g z8dt9&i%SWBhU^w!sbJaCWYex+p8W1;*Uw}V72ZlgvsdlZA4|6W#|o9CEm4qmk3uHH zm_`lYZ0H;^@84t!2V#2#1h*DNLFor1p(rTXwQSV?!X)oYklD|abA{(#5RZ_pz@i6k!N8aU@;LY$Aj!*tjVv0>`I2CZVn;IN$MD&V-#0F~LHgpMA;xqphcK2T&=6nGZ z1MH!R9DdGmGk1Oxzs4Z7X2I{B@(s*|0R|jt{V$|~LuEGrTlv?HYn(XPjjP=b_wI9B-Sb=4x)mtT zO-CuJ9OXC3u6yx2?`7BDV$Xk}yT|oHbPUocAy?n3$smz76t5PbXeU#?_k~=yZ*(hh z&LjPVBnT0jIX1AEi<=-H_3OUQba>7IxD+=ydtrdO4pC3lRj`6g<-|d} z=i!{_vS4-kvU?vFx_jYLtbmZ1F>Oj&u+rHEQiha*-#3UQendQS84gV^;CmuQ0hTsMXXCt#JYp1A^Qe#LahV`%qtXw#EN;L|xB(MHhBa7Q8>~7t@2gOO zDF+|~jAFMnA2V7cWP^MACLU|>%B5YuCRxoZQVc2yep9>DW?LM~$=5On(pLva!YRsG 
z-r&T4xq_nL9W?QeqHy*q1i**Lymy(*dzS6MiBdA?_~HkHL+%pJh<;{A{q>R4?s(^I zlU-{kEfIvuln$e0{L{G3hsC;$o%B2SD|o~2sv!m@#ymrzWaaWi5OJgP02aYzcyrel zw-v6ytEVoy&kMhTUg4YO$q385PcN`0gmo{HCU;2Acs#HE^ zj%d5ZJOzec30A%2f9~oHw;wtG=lJcHYv+E6XedWShZP#WhSDcd{Z+r1ERxd{P1+Lg)F-Ln=0Wel#|3q$$ee)IXACx{*^S?=Nrx=-#b-K?`?}4L=LAZTKUjNPdwOSLk7yD zSb0hsE+-r}d><+zoS*hJ17tA5%RWxpyLs@0JJE5&z5UDv_a^3Wo7kD_Sl%fh<$OW1 z>p_(HI`KX5>uGG%Ie6#r1r+SW!Uo$9Q-{pD3A>+iu)20L$!qnzx$e#FI4eRKaT{{} z*|_tST)iBwL%w=NdFRU#w`0|!`EKEyS)4k!!X3aFkxU2w!dl@DIt+r(Xg~<)Uc|rv zZmDnnc@j6g*W5cgmKldcTUIZ(LzxQcE6CMzKSa`1VD&S*{V)GG;f|nR&<*^w;}_!3 zc4hX4m%bm>sPKC|7Yq5)>7D+(4nVQ=#lG2wn95{2Hxn~o-FzMLean}>kMF8KIFndql**syPq@8?rIdo5q4bN&qxge>|y;eEG}F&shFUDD9B^ zRX&(X@h)}PWj~v`QC@$h3~`Z^Sn2u zJl;#@edOjX*Kr4gj_v&0u;pJjM?xH)lj9s`2-$TvEb}9n*nNSV|HsoE=3#FL>dHO3)tnsgkr7QMrd`@HEgvV#8 zd`f*En{(l@?&B58uh9E^>B4ct_dc$!dan*qTcsP|t(5ew2EmH1K{>5bL`ncOa!+a! zq2--Hikew19Lz2otmhZWSWOMVsI2s{3d29rig;NW+ zvg2PgZA3;sNC0ao z?RTE9VB*&YGV{n#EjBqna_)9)sn=nH{{kBQlJW21x&#wHui5XBuF^D0@O2VbV*adl zH+xf?xfgyGj=&E%FZ%w(c^n0y+W>(yqVokjApn6pj9mZz#ZGq)J&51YSp~ZFR+sv+(vW2_L3E?qn_vK{LOlfFjJ^Cpsh*DO8K`JExA`Y0k zjpZz2(}->1Q0by*>G`QNp9DaxDoA-xxZ)NveGvlZ+4)anN8g2)x?c1@En?BiRJ1Sk)|4Pbmt{r!CLTT<6ws z@}L|79YD$P_cNEwLC-xbDIVdfSnW4%0M}di=)VK3E&{75lP9_^xCRd%$_23qh765Y z)ruF(+&W4>`~{a{(TjeoG=0&`8|9-KS=$h+#8{DQ+m2e6mJ$FFq3zLc?Zyoy3Hkfg zMz`7`wrmJi_-vcYXsCh9!^mID$S0vBp3yr>n=Yd$s4=9S@2Hw;{hBFg3dnycPI7kR zXy-Lt@QU5P9yWSAzWI}OoRd(~B4r~&Ktzw+C%=t*;jOEexV@MTeurnpUooHTK(_CX z9u}~J>(NXpQv#rod&?FZMyswG-C95j2ERlWi&HRhV|?)KEvGh0tLblA+T~Ul!YkJ{Bwq1(2z%l`mv_%|`@+k#o0KZdwjcKoFvCw&F;D2P78l6iC7 zRt_9`4&8uH4j#8N2JXTU5n-e*6ljF>Gh2S;+$hMOs+gBs-=nHq%wP>9jc;hobOBm8*m|Y*)vle9*XX$GOAB{XN&v*@+{gngMxMo^GE!}^P^_n5 z)M{Z>d&>=$wu1EKmRn&uSp)RrNa$S8U3cpU2LWB_#B{L3z46RuJ7ZumN_5hyZ^qCI zSK~DP4t~HV!ViC)b0@Fga@~0K8=|kI4B2#{((&1(V>EU@*)(jvxfr{hO`aen02;Of znLJu7{DPyo)Qk&eWcaFq+AZi>PEfiatu9RlE45tCfO^;!_rVG1z)!#lTm!$cZG+pl zXS3UZ-~74Q*C%m`UOCW>o2Gwg$X&$m|L@1n;|Ji3`xEC#H{f(++$YmhP1Zk|p%ofq 
zaRH*Fl@hpd%dA&?KmTkx>!CIwa!KK_7dcUF+ED9KQUajX5RX%y4NiVc1VF;tans_| zGMTtIzp+JDZH_!C6)~92mX064!)j7kWgG)>fDVV}9J<=+u65mU$1h#sDBaEOCH^*Y z#y~sz0gY-Qpp*J;+`?x+j(@(#=l@|jJ>ocH9S|gTf3{m9_i-}AE16svYkquLhN1CS zs(>spAB<5}X&Cq@Rfj#>(S6O8WTbanG8IB@^IyLtE&^7QK{(_M0}Kex^8 z*|y#-nU8(|&UWe*q~Lc?a{i0g+>d{razAk*;Hew8T^F|fI;>~ZAMzBrf~DXR&L~dY zaJq8tRhGt-|0^SpF=JFzeWCgndS6DFmZaAC1?4rxJS-Qct&y&ND@SXxL26>d5d|bL}L}nN$oX&g^>g|k z`V9Q}u83F2lXjO*7bkA%+)Bm}Uds}gC1a0`VCkr@#Cd{@RlQLC3%wtkW%Q>S=V$mo zD6^co9KB)VVp`F>E!tx1$8kxY$`>gC(8%2$UNodN726eXr0LKY5D*8zivWtw-h97|w72JH&^F_CJK0@rm3g%et52hc0$+8L zHpd26`-tG7p<&m7HvjL)`46DouS0vT-s?qV@rZ+dGCfgZ2`&PvLj4l=eUT!HekxFo z=ARuy7>u>J!zM_U8K%KlEn1IqKNhHex!GkWQj7+hD@WBMMw+tgH#TAIOB`E%vpuB* zK*RP}FqC!CUsP7A#r2H}^6-!TIwp>PS>`pVygad^wXvL_de|R7y857dzb5*3b`2sG z{5~df^oOML%PCoB2t4GS4jURmKj4@19q!_dTj&RLxHtD~b+7E&;?|-c(8kfjNh=DD zaBT67?jCm#`TjR3`2B%9-%jNI4>;9Oq6+J}0IWIHj`3J1gShWw6gehpfpR=isn6w_ zF!(WUd5kGE!G{9%FS~@8`B`*+viLt{`m*bpCM*X6DFF}>YT5A|F@7@9m^@lE6ynUI zC~>32$&_6BH>teB3uNh1pTw~8^%JT{=r;WNNutnQIBw{5*|$mOms1gCuRkH&QCP~C zdmp%4C(gRF*KfPS=VdXl#l3*};7VNXCgDivF$#V?9AjTpui(oX=UY4;Cc#zh=7N*b%b`G?Y?9B0Y@)hb+4Sc#q?*>Lk zjNPAsEWb}9h^@_Pdzp>TCeuy!!GpfFuGT~bqOM_Lpix?B#9X6PrhfHE34r>+zIe&D zK4_Z|EEHOFTABu+eTKk8Z7vR>v0p8%@(8w8Ee)%&P0xMOh!J~Uo6Q@24_wd5^X?oc z4jww)VPb)o;R>wc=;5#w$jn@CO5{8M|SsZyh+#wsROuo0UU?jS;M(mYok``O@ZlmoIIDyvsmyf*VAK z_C5xP zwvOj^O)X$ii!!42m9LA%&jm)*Oh_kSniwEznkI~wg_YV{yZWoxNX12pRJ(fDv!s*& zs3-JGmaPfIpOgS~yk>ZKS8EW*$u~;eC~*u9C;?jgA}1INJr5`4VOjX=*l3ZT@>L_> zyT}>VC!$rEp(7sx4<3!Uzi!_))4jh??Av$xlH0j&4W!MEjZk0WGncJ_~*>q@xO`}j^ooQrhlIsXM8S&-{i}$aO|;P@|Bxi z7{^A9{A7y_8C$)A$_1BfdkbY2C0@Qb{+-V^3|GDL&$?G!{~S)|-zkm{{=5H{`|RWS ztN*TNpGNoL_dGI^%tK$-=&<|!!bNxa^l7(r+9P)t#evSz$*zCeG&eGd9l!Djrw!-H zr@U>+jT0qau`-lCN>CoY>~Ma$X?>jB_x`z9TE!VtPN~Q~(*jo;>p9Dbj4e2^ z{4Atlg=RzeefYjySj5hQRU^0lwvZ?zp{X*1lKXu9Raup+U$BaW(-@rdBpmn3uYi2W z{Z{HZZ{1?#r}ViYx6JT&agnsUALw@j_j=s5o-TJ|@PT_Yd8&IjbB=p7ccF8$=Qs`* zAXTV6WAaq}^i3~)T{Exjy}b+K@OmsvlrEG;-+dlqU~eCSpHN=m*Wvqd_0vcHGAOTL 
z{^5-B-3R68R@-NvFG=2aN)Qn9s!MKt#^t9MxrOp-T)e!6<%t*MmzzHKd0ckX!Q$jb z)U*E^&drDv{o2xUG3ZZ1`QqfEM$o8MJ{2r7R%W(6*n< zIQQnNMLVO(KL*Xs%#YqwN-`k#@&Yl5gdZos`*|7_AM` z=;H0P)nQL_7=;)b0D)`AkG$UvfzZVXh!VsN=tmSV%n?7zW1hGHv(XKh8YOjW#<8(Q z({#1rW$80Ga=Y&JQdjDBWEV z4~PZk%;OJkzzlH%CI$UOp4MMneHEG+j-@~}pckFSS z05o#%sBX4fU=nGE)zVb877G8ThQ|m>X5Q6wL+}VO^YnE5id}CS{@V5T;tROUU4J2H zDZl%01BS&7fDo7sN5ESO6iU-d?`#Nmv!4@9^j`IP`sz3^YrW+t`K+R`}2SRPG@?VhEY%izC(D=;G>vHmG zhBU!g(0H-=sJ2%>wD<_D3IJ0ApekTZD$`)mP<9@%C6#rf<>Q-LRy%VRn6mO{G=LPT zk6-B?2D^SK`Elq^;=BGrRgi9g=LU!%kbZz!3ZNU%>RT%tr#a)T5S+%n&kpj|22=L* zKi}SQfT+rga(ksJm1&YpQvx7jN7Zd`sbKB$CTi7h;gH=al)Gq>E;bhFX3b=57eA*ao;qj@a*&jbG+{{8# zqnX~P1VF^z@|&)R(UXbhCy}wX$+DFvQ}#5#G8>t@xR#ksG96ciTulFiMoHE$8NYVm zGSwT=F+X(-D!&&(M7jYY1ZK~DVkzLqZ5^wZT+zTjtAnG#6?cI7G+Z@SB?f5K;P)z_ zCfA~=zF3vj!q80=lqms_C^(bOEEvYxbWHTK$q(~4eK@Y}3$-&hO^{BAM@o1m z>u1L=6F(6G)spqA*f>G(7V?*FfDr+6GP(g%r;&@=WSx_fV{Th>pL3$u^s!J~0zH+y z<10!9eZS{}hySL03YA~1jBuLpeX;y}JfCmWI{0+q_&yJRC60YA7RSd;_FDTAldXR( zb4v+;T0%QcSsKM!Xl#+!)hEv0WL|{lVpjGR#LZo(d|aQEu7%?0eZKgXCY%BH*?8f4 zhTrE)l+%uETaC|}Q$UnQm7SBPPo~!%HsL7Qdhtu7)Id+$nSuh4$1ZKT*Lz2Z@{?vs-cA7VmBq> zlSxwaH=7@^e11g}2$*k%Qw2i z@51-_iYxWq7vtI&I2_ayRC89qPV`#gCkSe4aIpmz`lKVI}z~t^W&U8R?dEf|0Rkl?Nd) zcxAGy*{_u92DR&lmEOm$Uv7PEA+1>rk}F8{9uhYI=R^`?%<{ScoGFl1sPn}I{pKGd zU5lkD_N`FnMbh|B;W1Vyn}YAj#EbOG!alwd1q+DrA<4Xau|b}8;irjcR^NiB?V1t* z5qrEE)1_q5T0s5cZ}%lrG;aj60$|1P?JcV%K`%8Etc+0MM|FPaW&I$3Dbtz1__~a| zvY?+gk=kdX3@jEz%y>eeO=k*d`Oz59o3imW5!-3?Y6d`7zm6#ZP<unWVLaImn}utjcdi%#`%4 z$@*dS%e2pq?De~*m{~PTbnm)2%nh(J1>{Ark%Pt;^|QHHZatMpFrS-Zd{)xV09K9I z&f4$h$^~TA)1?GJL=e|!9Q#ta+O2<%Sm|bp=64OAORd@^cv4^dYs0aWlIEnz0m4LVbQ*yZh4Fn6LzdhzCr(Q}Eq%%TQgkFn= ztbRwR%`Mv0W^YzX07UFTKlLL)g;|7n}%b5JLZ^z#RS&sXl_qGUx>N74<&I_NWYz_b!s7$5T z#qUDwK3tYw5X5mP)PDQUPrmO#^2|4%mlBl7Pf@h%mm?n`emlH}XREV5KBYnJ7dN06 zSY){YvqHK7rQo7I2zONQ>|g7ukQ*iZ&d0Tgd44R9AX4@|PrRb9lu%s)8-)@G|4j9KX@BVym4{-3rpZ1#wLC8_Q_pN^d}!ecg5oPg&-8Y;O!(eP?_a6 
zvS5nP2hu5-%Cn$nf3Nq!>oFHz=6xTERP6h>Zw2xbA)=kGkekfB)c=@vqWwdN=l884(S$9Qlz^_gMhSj z2}nqZk`g1OTN(vKy1RMz{ax34?f>n0?%nq}pE&2-vJxeh86-{Bjbrw1`u^xkQP$3T zCypr4lIeZlxEGC zS9B+PCi3?#@@&{*jz(qdzbfonYQz#*N8$v8 zfKTmnLEw&*cG8jQSHVx8an>C&YRx6V85+z!nvbjj+WFzI!8~LFpP=k5$&=7#h4e?^ zNH?@q?C)K8nF36!7=U-w@>qcWtEJ2}ir@-@H`4v=X`U@_gcWxy74SrPl0j(_FI)v3%>MyfOOXCtJ~Uec0d-TUG|9Yh&Ky3(7fHx?knRwJAKA zB&+Q=4tBc>Gt!#=)pX!qbAi`5aiLaXK&Nd?^=Yi|DLm!E`8K~}9fi^)lix?$!+1o7 zX-YHojQwhDyw>bSv5-1(%CB?{??3tmIO?Bl4FAXB_^WR+zk+Tie?8u7^kR#e3)NL- zXF9pYW5kCN5coHuvn$dif1=v7Ge1M^_fGY@?|b8V(mK2jC14n=m(dyZ1zi>jNY8P+ zmyc9nEDVW1((@Q;ku!Sqk7={#auo8_R3p_spIaKZd|P2N{3-N#CwnNqa;-J^Q#^UiI;Db)OtEOBU#LEW=F<9P~j`^_2ZSNI3pMi zBU+zJ7?^%jdnbJ@^7fVJ;(7V->))9{d8k=33w&873!i%SCx2m72b&0k54eB$(P^5I z-t#h_={?qBwm=i09V2(=bMKWdKN6HaFDn`ELr=1Y8|LwDw*cJ-NN{6^ICp;P+q77} z@Q=SW(5e9-Nz*o!^LG&Oc)+HV zGG_g(wPjV>WP(`VOjY@)@z3YMee>U?+GcgyGNt0uRS zXPZg`j@VY}s$*R4#e>RDU@%w)qX~_XgneC)R^PPQ6#XNiCf5-xP4=F6tZcQND4HGd z{*=>5`JCVv`;#_*;!u9{ILz`8z{56s}UbGt4FYtVvRRXq4&g^I|8-s!yz{tohkSvPaCiv8+R=eo4;ls>*S; zPNVtsseP03ZmP?^w|ct)MS(c&D)Igo>RVo^~EJQVCxLN$yR zUaiSyiR#2>6y0N!Qi_Wf$oAe+Rt07cM5fVx^=3&o?~78)Bm$6ns{F#t#U7}qDNZxs z)T}si3r)_k_>V&fSg7eJwhAn8wU75${8WI@g}K=N;lT@mJ^~~jJ?9DtjH*vGmuZgR z%5>l|zfPc>eP9237wMCTnITq?Uepwq%-sp)mRVfTa4gl%SCoASmV-|#RlcrfV!=x> znUg6xMS&(pSTQdl;kP)lQQe8pfg3g%lPMzR+<#`>x$@03!?S3sWM6#K&DQpgSfo{* zEjYZ53;V(%olVRZ()<#MmsE-@AvzoGFFR@ELX$83+g3m2nEHH4aW!2(w<>rNWyX`- z*h!$&y81%<@6-u8kKHtY9HRt-@rd0wcC>m)8J5L;{k5P-OBQqi^gh&hOo`xey0Ji& z5K~G}KL0s>-p06WF2Z5@nH_d!DVzaU_BA0ivfn_jnIlLoE&~KZt2Bv{YZs}(55`r{ z}D+)%)tfa=YQmMFwsi~WbjK&MQi#KcMx#g7$z8_CFf??Zyh8n!*-wd_pbR^ z96S^)uBn#HkRq#bqh`8#Uu&`a=-73Qf@c$Xm3g3$-^7;ka}C`?l%Pb(NDu7kPn9xV zY6a)T#+t{EYb;xHpxP~U=DXa-5-0^4Fd5*?NO}5s8Wu2t*oE+u8&d&L zcLFaHm%(zNdL)o13#0N;$cPwssJ>`ZG`)ht0IQpAl#}m2w!mjZwNYA4+V$G%a+h?> z)#g{ZY|K(Kt&PD18TIKz>uu%>Q+$h!NiDk7cXJlD9S-$nTj1t_w}DjJ(XWW&bs*#n+<7K7+(}C^vd%aWg${BeNF#cLV9&N^ZiDuFN(#0LeP3t z3Gwxn5jbMw?m5;NeuRq8L&DULQBwj|lvdj>OT>V*4CocMh1d#KG(YQ|@Ag;_;d=Y^ 
zDGsX&`%_u76y5QEN470reKiJ)D0Y(^$9%9(I$gPhh2`1c*s>ti`dp5U-soZFq@md1()R?Lgt= z(_K>WY`J%Eph-7%z%5S_D^uOI*!gxN*7?`zi^BzTiDk#TS8m_z#xcx-Z74F{EypQk z61;t9w^@&O>E>p#`NE*6Db#dkeb_;703}N5DgSz&&mfz4&mqaEQ{^R`%0*D3;Bki; zy#fo7+t&^wTNC%N{o(U;`O&oDZI;CG^tkiWl}rn9+4zru_J_kY_Ti?`j{l~kztJpXl7DFfXmP(i>?EnR6l7>u&CG68&MyFA4@Gk z-vahVgPi|x)ix(ki^>7MEAgtmX$i5Mz{8coJ?_M?-q;u=*Y=p(%%-Arz09s9)q0We zpGqsbl~b$#lPwDTz#OJQO~|46PDlR5aOSa=r_Z6Z>xz&6L`zD0@OS>$aI?*ktiP9% zcU7hh?&(X7UO9JbNTAR0YQf=tB)-oP@W%>PYCO{ZAiW9FMj??{7ehh2i*_U9ng~tz z6~7#qC=VDZQchJ|WB}W?Vx=?^!1TV(_8B5=m)i^i>u%fB1ili b5a!hI*;_Wgz696D$QRchTYS99Factdy)NBH$zNzX-`RdrYhNDECY5h zlvV1|F$*^C6PMFsi=B76Tb!@Poc6CRUd%pKnl?FCC$J@aaSuy+5KekZ=Q3Od7=ROL zj?=2Xq|46|z{5F!{+V|bf7Vx4b4>TCT)cQb>f(fa85h&8c+_etx+9Bdn2lG;G?5Av zn#_~*=*pAyTC6ty)<60-k=GF~`~}WsmGL+D9MzhTQi&OS2t*Qs}pKP{GHzhyvi z>i$Mj!^#_9^IJdG{Yk*>KD4`XK`F4+kr3Z_F-vPiFu+)rUp=391t2EiYG;xR8CVU| z7PZ^-YzjIvxtiUpZ54eLGU~KYzc897F{hE$?FS%@09rpOCDM3oJ+H}<9(u!#W#Th~ zj+iPrWn?yO;sjX6mwerHuNn~GoV3)pEYwN=jeco535L1j>U)KJ(V?;M*H=p9@8De|BO@ zJ09soJc_VtXR7bjcqY|yf6KFv{!=hL_nz76y|sL9jbjk>mXJ^(T`9GjWVHRta^m5B zPqxWs8lx9*wzA$AHr5sVy4=UNFv$621`F-85THw2;x+hS;u*oGgIgRMEJ+HFdgx1OB6{`gS!WashB}N=u?mtYEUU}@ z2)P*QJG`+g_)7bv#@9Y@ZM3#%Ye{4rl`BI|$7t)a(t6PGcx!rfSHyJq7#=M8obXZx z`wM0A`$kfp=LMl(8#E%y2n!kQbTs~E zlD?z=>+ThiHyt@Vw-WMHP|#xhZL8(#Ptel$TBj`o*)M%6_DUr%bXg9ZLxnGQTLmHB z79n@V68`&XdmT%Abi=s#;Nxb8jqj)-!LbBtm>j5SFd;%F{WYCXGsko_P+-x_bzMpk1fgMbWl#4sVcp4Gdn#zZx=;MuJQX=U0t*Mikm@Q3_Sh0-0rpfP z7EYr{=kP!+BjCPi%ZR5J( zAk0<3b{AL(@z#Cf*~%~dUuE5cTS-uO!RNnSL29&;dq?3JDZY$Cx^pH(7R-_Q)F4O$qyEK|O{>17kwx@4Au;qy5E-aCidH<1ZMOx6f8Zl$&a7)G`K!(tUb>1??LlQe)R5 z%&$u!KxmG4e0Mn(F;{BH_jZKW_EsuCt#I3@xHN`emVJtJe15f8FE&%*DC6IE_02igTXGU=isa?haD6Qn51X4)M&nNBhW!;Ec*pa{h8YnvLuyg>P*)g(bPuMJn=3DK@*e#Gt$>JV z2{617Xxx&7VU5RFub>$}`CQ=9>oV+3A6$Co%#QK>AEy_?(B3VeAILdqZ>}~xdO)s} zOqFEat~L-Rxchd`dhW3Ow2Q5vXKw?0lb@MQ;ne@sH8^}gUx31XNzUv2N(qq4?X0w4 z;uC%gNHV{7)M%Hgmd8V84ayHjyhu{a+afE-a}^L7LN6hkEIGIAHmARm?I3Omc8+2O 
zq}8}y>k4J>Mvnik@W_CYBNIj&4Crj&9|LCLEc{vwC%&Sty8++6ecPBdbx!UMle6(` zc3u%!h`t6|6o_;2IIYaLO(VNxc`Q>$w@6XN7vTlv#g){He?Mqyz zo7GflyRpe74-^m(5Q1{=%$CGbAus_;B?tI-a=NR<)%^7Jpy5;eik(!qKW}cyQLh^* z&?xzpdxq4Mu#x8ED;DVe0qv*sehG&fi=9QKXje$CKYa{hYFK}2YLU~%m7pZ#vOj@f zkajqO%dpco7I4YGFCwPa62fEhUD=?=w9xtqs6ZmgnK8U00%79qeQgP2*yNnJ^ZnkW zHYWVNygAtDx~Cx#7}VVwEG3gNy<6S@!WjAm9pD3p^}fiEB?9fFGU35U>{quUn$)Ru zWdDthCa3%S#v6WA`gih^PX7A$Kgb^m>XfTqCqHJ$j~@XWyT870f`#8`n6;Vl^3Vne zu7+1ee6y~K?YF6B<8K(nyRn-VEIzoTODP6F@bTh zE1Y{f*hkjI;99(DH$?A?*>~YF%{1q7kQd{62UD$fIZXLb^GwHBAE&q-=aixZNFaTU zsx;wFt>T8|gK0dex!D18zJJ=?_o|!N-A)*MacHE1iL4?WOI-jW*ql}TP)qD4YSMmk zGnwQNsB`d%@iiOZWh*w6MRvPc{~l2O)(pZj(hSt3YAxM{0k^@Qy+9}9W2#Emzn2L6 zj${|w#A!M*^66>j6rk^K)0dr_W@cuPG3S}@`P??mvdmm|0CPvtolmPH@J?>pG|zII9wKu3=aDH%FwSb!C{N;GGi$5I4~jRmp8&TR~yk8y%;@pqk5b$o8>P z%Zs5SeBC?dzD1zdd(}bVSU&qta3}6eqq4^mLyO5P4)P(@qMi`&ZNRSA)=%2z$acn0 zg-ffbw3uFLh5nkb$~=I885F%iV%EBwljJlJmx&(8;?13fnIsQ*9c;5kK#nT$#T$*b zZlu;-NyEVH$14;9&|x{8lTf5!^y(R_M^MNV`JJ>ZurX_tmaz^!C5Mg^Ir(b^J} z{c~N?3EH6!@I8Ky>FRQ1+|E zQdzl&(L^1X@5=2i($|(&fPS9*W&&Q}0h0hQ)-vo79cK(P<+t6HGJ_7B#tkVsll0E^ zSkEC?^ibaY@e-`t*$;<;y(YC8Q*q?_vyB?m^ad?}n6?WHI<4s$>Ss@PD?WdC)Ckq> zdE%jz3id~jJ?;&kO#E+`*I5V)qd0%Nb=zXSjve&tHj_y`XP&5)p@SRwlf^(o!$C;~ z6N)e~X1oCm#$q$E;OpjGhQSD~m8x8IUb?}@b&ojt6B zu1D4Gaf7sF&e1N#iE%{^Oezu95CBe6ewz3va%e0d*sut}R?0>OBPoqhynpMXuGv}S zrwEoQ*Xj!T6-w=_M;al#`_CA2G8ivqk5`f~BK~4D3p!9E z^m+v7-rk~Re?Kgva;Q1o>Lh*rcyWxqz`R%Zm?hK6kL4enICWMa=0;@9LNGYD#jE_r zBgb9q0;5|8C_F$+Y0t&Kx@g!a2+Nl@p|bqfbuz^4FZH2Z%VxP(_{I5|uzSSJ$)SsFv8xf@$Em=k~k^6rY@!!V^<6( z(O7EjKOh*GJ*du~ABmrEm)XXaTc2Upd9hX9n7Zh0LIqDmr`MwjUyHD!UM;j=T3-W= z&?4+b?r$=9O&JMfeEIHgLI;Jn$ z2(MOVy-zXiOin;_***)+&p_Mnt?&G4J;G|jf-5mxKmV|#YD1?%MZTC^j~>+QmDePl zo=R{svN3%J|E!8cvk;^4f>fwfi1im*+C2BQZgg?izKvvb{A~o&Z>BKLBJ;AtA=~aAx8%3N$tuIFM zFQ`Ds^=9>hg z%;P}2GTy@0vVEe#`#fX_=?HmZuYc@BQ+`rZ>wXmBmojcwSiJW|Zq z6`2+}@9I0{o}YEs4Gg-Km2kj$+UT-7efwDmz?$o7a~=wP>4y%7G=EY>LtKY$(8R1~ z$Z(k&)Ea$=kRs?AI7;pkYQH(&@?o*Q7m~cayIXr@g0B5Us1UwfyRTBdb;adOZ`R2x 
zwqRGo*ZhhzPF@v_gn_?aw`a@v+Igd4;E1Up;a+Oeotgi->em+2lKQES#@&3Kqe}2u z`7acp=QMKeC@(u7^+^eQ!QP} z5Mg4;MARi&k%WbZ{#Qy_S>ZD+>TYaASwXPs2U-~h37>=HZ~RnyQF?M1;=f@D_k~*o z%>ij&=@b4%DSSTh?mrxVlh5t|zyo82d%B(&e_P{0atFS|aiv3-6l7yFKU}KQ%zx8jacSXhx)jnXGD1Z0SQCFTW$>DJiycsHT<=-xEwa%?T$GjV zPTpjo1x?FUk$@7R%2poUKyR}LelXl@Rc*2dyT!e$tXxWMbo+6>Qn$yyX5<)=yYoc` z;Mj!$e(P|wo4lKiadUevzHoYN>vtN_~-1#1r=92b3cv?+S9|A;Le9 z@ip>3{LPm!RLek)%k#Hl?e3a*60NG4j%9bcxvApjCV-1RH<+>HE!S(tc+ubl7vN_Lzy8=eNCL7+Ys$N3 zS?4AXT9KD_gB7n$^_~6ODa%hG&kv#s&6q{jhX8T!1&IV%liRJAAB!!5SWKG^Qiv@I zbVKuQS2nI?)7UrYjs4#T&ZV-uXW{SN4p564^wqrK3qsFW3+Sij&CDGPnth{nV!RT6diXqWGYu^t@d-SNVmD7P;9V4~n@X3k>x zaOFoN;eCyPsIt;H;u<;-{j--w^jERSP52e0Q#I`|-k(WzVL+`|iu~X<#*rylaW*^t zh{KtevNq}TISO~UC;gm)m3eNkeYDe9n1mIIVBfFx|CyTPlX^(knTt`7A6R*0?jDf7 zm?7if!f)@)gOb@0&GsiHkidF~`dYG%aKIG8#*=^_rbDj7-k zg+MD!NRzG?|JF?6-tt4KK$a9@BZ-ulzeTLS0qW_`9QdO$ei*6UXUZkEO zPEslx>fV)HsXsZ((tQuqBJWynzHV_jQt2%O0xZ zAzVb4FNx&B*pt_TD|Ls4hG6;&OiCTkIWlS)t+WW1ZxPp&>_aru#<2$ndc4fa}gwsVWWXLEBxmBzz8Y&(F`*#pMvacs(pedKj0t)5djfe z8#%%Lxk70~#KlV|_kVB2p-->G!uS4C&E9a1s&{rdI?IKrOECVo5{0=}iJLhs@c?bUoHU{0+KqZ`{DO`HxEUYr2i=8{UPw_0EUOB|_+q1BJdcI@&<` z&|~NoQNDFqiH(9 zNs9?6U~=ts4HbmFM7H89`P9384P4)C#xT>-fmF9e&ZwdF7eM9R$L0s8B#2?41di0 z`ZYKQ;XO-m36#%=8(C2`5pwi%5w4yot2Sv+C6xPq+<7)8L0q{t*CYRCpt6$lpNPr@Bo?u`X}d%3$^;~Nhdq^S z#!jq~Z#F+mLRwfLWQP_Cfn<&bLMWmD6rjF&i|;e``#PJ4=ivY)>6i^Ev~~LJz7_%* z!d5cFhhPO_uqetZd2b&^I^Z8;wxFBUnPg*iM^rBJg0b;eEK4f=|pw#cLQ4{zle)L7`c~d;g_X zd@PIuPx*5)><->LPH=Hca+SL3cie^f%cYXXh&5SB^!`SxK)U)9E8TNXWkSg|#mqf1 zER$Qp*RJ%|12~M+mL{x%xH?p;8)SAd2!#)^=6m%*L)_NQEkz7MU?h}%gd7cs7b;sr z5BVd5NNdlRh6R7#r=`nFjEM#ViZ2N%C&dyU?;08!upP8n;D?#oN84l>G2v#SFAj$@ z_zWWe#oLokNf7{jLZSFK7V~Tg6n~vEQg9i)8#Sb4{$9vD-JfBfOutNa4r(ia#6}-- z#b(8k^HpAbVgq+hUxv{@w7^C!+!PS~>*}WJG;I5{P@W|ENY4v3{V56%Luwp7vJ)gNiJ4x)nNW9ohjsWz{=;0(emIeU`DO+sa$79{8nb%of z+jfB5*P2$msfqAbiqAJN@o5ktf#JKnccgI&aHe2sbr1rKkt}_uOh<*0yh#NV{?3gG#L|$!y#u`e%Ry>6k%#h; zIZwoYq3wsWf`kMUWAS-4kj>YV+2Yt2pfkD7Y zn+m{zh;9T9R0A3`=XGHL$3gL7d;+fdj18Gmta%ZJjv 
z0`|axzNzO0>+h((3UtU|fhy4{WDp%^BwrjiD#wW(632D;=pE>hRRKaJ#Yd9A2o>6x z0lv0=2aQUMh~MaowxJylRnZp?6-b5Kl7b?Vu!9rz3-DcUCI+)6gurQqKCV@6&S^)| zni?Pn1Ej+@X?(Fquz%uF^z`@WtbCp*kw>mx%txaRz}KD;Nr6&^!j@pyEa%&cD=zu3 z3hob+Ncj?_5;Zhg9#VS;2koe^MW^G(yCcKgM#Z$r-LL)$kQ{%1T7yh8$sX*zd(e6D zTozueYsJ&*e3e-cSBInEangtKxSpbn*IX5(3a^=dQ?%~2uU0i z|1MNW?;I}ec2h_;5M_dTFaWK2c2%$jzR+j?FCUdg+`B69lTYqkAczyI5>?l+`w0pKuhrZy9c`K3;7L+e2b9E-=SuFGJ04^IG<0bTVA(z5{geNA6u2svUY1m z;OBcC7x!Qj@1%;stp0l*Mz7tQL>lo=Gx*tqAI3e^r_d9bkUWy+eIv#`BDB7aY00{q=gmR3KCGglf0_9%* zuLb+`=C|86a7F}C{O?y0x>45QS5Ar7Oh8#i#2Gty2@zT+gnS_fLmP}lg)}0a#ye>* zgH|cH9Rv^%AHMSf1S>pmUcKfxH8hiH-l^bhEoFwQp~onoB3cumRy?;jZ%DbQdU8^*8`k$zrawl&l^Kh%b*-=re}|oM$HhMFtDuq493oMH#3Wo!sGQMyu)@n3adb;+=torH zYzBWG4Hi%y%pOAlvfSw9?<^w$r`6DN1f*NYnBRTopOc|lZD586;6ev=c#yE?3f;I1 zcilLA8eJzS^nek?aUS)k%$~hvxTTWv*QMFvh*=}K8!3YnaglDckz~$TnxTtPJtG_ zGLI(x{1s+?N5*4PQgxTQAfw8Z>2TbB^;>rra(gx%V718jXobX8Lkp8_x`vTLQ1$9@ zZfI#j2#5T#t9xn;?3~9sA<)Nia88G1*bf_&Lj(}^=w~d{c^S`Za=t^*oV82ZYE1EN zv_zg5L~=8Injh+A@1(_?yC3evZvgxSWroMx-((pp8BwkNY{QY`gpLg1v5~t|Iv}mb z(9{Mjd?wy)V-rOICWv3>dZ?#5=C+O;V8HR97Leo)vJL& zxBh}-$xfw}qts9 zAPDL@ii@QLC+``?O~n)jWb;Gt_AQ$Ff&^O2BXhXu=B}=TD}~3F&J6&IPcb`!p)v)7 zok_MAPkC6~hELt}s~qy(F7|8Ga`U_4+Bg7}2ifInWB|jC*B81Q^wzjflw%icXd|D; zz;MJAU>?IYc-OU?(;1K=Kfdm9<_5Mu;q#BcAjzP&y6D1C zi-t`Hw9pbc%L!druNVm~EMh}0p0T_H7?q&RxvlY!IP5>oA#9Y}A`YPn7pUG7H46bD z94eKiz~7w!>)H{qh%WZ9?|A$_H{_8qPtpDVgMR!L`v73|-j(hg0WY?ozUOe2kwkpL zjyx{(Wh3yV1$vJNSJ#NF1N^xY+JI)}n< zK_SDGmC%A0 z^FhoQNe&aG@6c>}bYYJCUy2tTxba-sH!@wJoM`I|(ElyOn~+$QHX(2QXSX$~ztNhc z-T(9HbeCNWHsbS1o}K)YJ&sysr9fQ=mT#U_hc_@nS~mP{@UK?1JvQQ`fQr*fRVxg0iJ2$xYtH!E_1lKX&IT%V4z$o6OyD90QWvl%`+^XcKlhm1 zt#n7P$95|#_>FU|s2~{8-v7qrGKw*{Q=)2*tw}!4zUM+~xJZOgm=J;zj%OjiK>Sc# zSt=wz4xb4x1}j@2SxphU(9#j*Wa*BKkmA5wD%2uR~1aa_Y)OfkS5e zPyiUwL7_NL4y7-K$Wg(yfOf6P7)E(?wgrvY71aN5fv#v|o2GIwq)p7hi0cDZ=;sRS zC|NiMvEg1iU&fKawK7}F9AMNjVV^HdHQ3EoglhiPZp6K?8Z3SGSOo| zQTP(==eEp93N@|Bb8gI_oS=ydF1OBt;0Mg}K&AaVe^XxkC8MZxUknidBfrzHZ*;(_Z%OBlY)|Rl 
ztQDBq=PcTBgSR;VKe7=yzz9ofzz+oKMM}Cvkg)HR5mI0JuFzRt3gdY zo~vL>(sR~iD9i=E6q%%(`%=3(cfN-;V2KJ#>C%o!R;XSg^#*my1mH61XRK@zYMUct z$?EYF2x7Vm26E|CaYxw{{bXe@$ruk^@d|yd{O!%jP}Uts1mYvc1T+H}p|=l+BA(3$aqQzTr|rORcg7IhSp|GK z^nS`H)pyJ0@jk3eL8)SYnBB?2A!4DG1cBP5Lz1!H;`$y@)CX0@EvLwbD>W;tqGhi+ z^ZKAIXO4C~hLxaKvL+{3F6)!O9nT#_n}c_6!=v`-Y4uVAfBtiq-B6`>V}IYD-15OE zq{`5Wx|Np8DM3G@B?FE+nsBjF5q-}htBR=9A0A)5*^*?$AB`7yjlaDuZ986mcDh}h zvvIlpdT?}9V>Rx>E7ChF%FZ&KTnYaSwucW40M;PLCm)zPcqB!i%Wxj8SNfl8MDxF`kA!9u$wvFwbh0woJq#$T7Hhg`#|l`mdG&LZoX$@;~}{7 z{PJ=r>zN%hAD@<>V9M|fIta+Y!;{`2ZSASBM4Zk#Qdgupy2t4N_yoaMMKe2|&{hmn zrAjJE04>QXX6p*(${HK-E9D2S3-~=QxC|^dTq(=ShL)6-C8{N{VvAvv{lr&6RZrl5 zOOuk-jHAB&6p3SVxxY~FIr!SxZ7#3g=_jwa&sx3b)mTKy3N}xl=mBw`326z z5#TbNBn$x)+7Iqss46K%zgue5_&_fs9WT*jp!Tj*40YK@XU*dI4b0u3EC2PEo-|EX zaGfRt9t6FpK716v2a8ROYBi#SW0Ww>%$Kpo;s=%4lc@FCrc%jv;}SC)8)ikqPF7ND z&nbRin98SUovX%Jhe1?KShe?iZCvii_@X(e`uL0KwCV=&~f{lUL2x@BT($Z3vonr0^SjND zE1Q4ShnSZ3K|r11OHMW%#KBaIqzSxz#Wis%4l>8-&H>Hn6vu)_KU8^ z6T@+U+ZAB$C@=;1>yo}P1Dqg7LCSxQcJwLl)rWoyAZ@;xQgP@q7rYM#N6S7|AUxZg zEStif!!`(_bn%|`z;y6_2)bkH;%U7T^>{8PNdE5WR+wuea;qSHG*Ozd>9>E})P>Sw zbnnugQ9;4{Qt#r>Ij0K>1O9c$80pZl0X}3!j zigcOBfa1k>3=5zYJ=@60^^BIRv#M^p9^HaPF0-Q)CP$+=ne8CK1sA8hw5%-WO7*$4Y3_J)@Bj2l!{veU7ByZ%zR7I~wSfu3WSANub{$wJGd0L~*vOTSi940+kmow%()A zUVPDPRiY-7X*sjX3wFH1FQfN2XExLQ(Kb6%EcNdH%-=)Byl&e(QE6sp1VD%kk1J>m zc7K51pYKHiV$iPE2Y@9atB-+R!m;25n~hA2!23uElXs= zkHa=y?6r^pwsnmC`35ELg9X01x73z+sA4r&>Jp1i%#U5Fb1uz~VsK|VVQA($l)@U) zOb^H2mJ?eW$rVrefEzBmaoy}RDCc*W#dhpxa#&PDPxk1&eyd-T7JJyLXi>OXgNlKz z5urB@SO;a)WMkf9jHRdoJ29LoL}@*d8(PP5qq%0T7{|fE$~vz9goW$FDqzG4W{LVB z+mXOQOA|;6K&#B!<{tz0PSEPDh2G@-M=Scl!9h*`c&sl>T|pU9W6t~S9kKgwF6V2i zJDhcfOGR|#bu9Y)|J`uSEu%lo+Dd+UDrDNe6t}grRhE4;8ys6_%zEiH zE8$X9!U{e;-5la+;?`Xgv-@YOrezp?egSJvG!ZI%bdqZ}@@3KOg(KlG6)4QG0G}E$ z5|8S4_WK%m^y5!*!Tq)pXZiRPiyAZWU?A&sJKt%;&cW2;EW;fX?cX$-QfXy_dDwFN zU{cYlg15!Iim=uA^51p?Z}Ix-tm7a>B0!{1bqB3UO*FsVN6RFhE=%S~$QgwsRm$Pv ztBOj8q{}hQtkfJ954-XD$jo>J0J)R5ccUuz_rM!>Ixr-oJL 
z*!#`E3>P4NQ`#57TO!0OS`r7*^-puZEHbhe6+7$x zBz$Vl;Y3IR@nZCvlc*12?$d{+AJw5v>F zqlcE|)XS^-yRXzb#%KE2`!{q#IykFqH0#d~u%BNw*P1uedC=UUSTN~M*b_5`19#W6>xg--<>rEA^&5BIc7hTE63=U84O%@U;?gJ zNfA%~jaCVsDU~7_E;xhZZ%0v^2D2uvFYm9rJZIhRGGBVRtBHx)%U%fV-sLl}>er18 z5}CL?rRa{0`RX?z9~Xc<&FfCOn89rJo()|2Zjn``V2=lRW6kjMQ#ca8dzd8G!sHe0 zD6^s6ZXpBx1veSUt!jHb$iJ@LwtAC#e%csR2yS3a;eQe6*kGiJEey^XN&cfXQ4Z{* zFmt9(l;#fOi~G-enAwp)_7Pc()OgHtu6*=-*j};3II#&PY@F+KO1Wh{Q!*L@)@e{Ll_&&D{VX zug^k%#UFNZg!;U&0XbloiwLXg2ZT`6Y>fm7cd{P}CypbZRG#zj@)~5yd3iMqqK3n) z?E%?g${*P2g2_gWWi3Lj%d>Cp@fiKJA9BCSu?OwGw1? zliKU2?=kv;%>UlMPVL#jQ#OD&9@mt=C|B1tc(=hSKwH)-( zDD3UvsymHSZy%1cBm*W8X zgEc3L%))xZ0E=T1oJXsJd*L8N&&j>M$cPn@R{6zYc6|RQ6UwK3pwLoY0 zRtR-6+;d9GguyOWFD!4-gU~}O!_%MPE8nl?=5I5~dJK9GE7>46&GIEylS{my1x z_NFDSm2!e)*WEq%`gURUJaBgr?Ksij{C@E0kvhm3DP(BBJL1a3UC#OvM>jbXrQhX z6iB1euY&dSv5VU@jOh;rR)C+RL|RZ-*>9+venyJWIyQ5~!0+h$VBxK^LMu{S9+pB$w;aY+!r)Q2 zjTFS{+yj&Q94-dRzoHt@W*m0y2RSzQ0qp{*0i!iuOr|hPlMrE%I<=3$iQeS4l5H0s zrluM6v8R@{NNuHl-cgTE#m=tmlxF3N89 zS&ZO*o!(|U|MW6fpH@~-;xttdf{Su7`A45x+HRM# zT*?u&!%`#GSV)ld>o#9_ZqMu{R9uLjWJ^#ZuBp==KYS>b^Lw#4wIhftkD3DM`>Nf5 z$zmDl2P*{v@doR{;xwIjB^*H};RFdb2}z)@xYl%VD0rB~jRsoq7>nxiOhF#qetW0~ z4zOMwIWjJ0BWRR4a8B5(DE*$M)&4Y5Om>m=gF2tOf?0((vu~kTyNu(0%1spNta0RP zWk2hG@PM(+6(Zm#6kJj=FWxTjnHhO>C4<|bZx@^cn= zj7_AER$>~pjqiMOv;N%D&CmC$J$YHc8HJ3a|5vHwHl2*>Ay2StOe^C!n+aAR%50Yy62(#<&LO0^z$k=e8H+T`H9o2-qf*cS1FBr+oH)NM`7iz_=*# zt-^ris5+7YtcKmVLzC7u@R)Pr-aQO7&4lk#t`K!`1c=zo<=M+kk8L`7Ts@y#^Sykn z&ynr0$aNIZWE5xI`pg3zAlONBAM$gI^;R6xB3xbaV>vWz2rtK&DDo^eVHp)zk)Nxp zX>U22-798QW$4K%dY%4*Jt4;sW_prrk}Hh(?+f7rwj3B&A6SHKe(&o59+d(E~4D#im;bT-Y4%$VtE_KcHX10v@Yi zf1WR(2MgTfa#<`OXG6Lql>+!*h()Y7&5r&}4+^oSgIRP1dW&hZboP*eIcbR|e${>e zCAP<{A^_yQ3@NjGyL6cnj|dEG|K1n~Aum(fUW%7p<~j0jHjFbVPq=9hEI1%8lXRMs z%ik~2;r$`?)A^_<8}qKjSLz$L z422+gQ_rdsQVnT}h5K}HvMV$D;bsYohw@Z;ppo}0OPX_7=+xLE_B>Ll)Xw@L?clg2 zKOz0|Qfn@h;sGp}SC_W)w++x_>%M$hAanPh4YHn?i5WhLg^$%(`dnHR_ssJ1|p z9P)2C)BrFn4GXT(4({lv+%AcuP=<*hh2r(!UJXOT`5fzlQsEMnA4;tQXDS@$MDv~H 
zfz=3Z@|VhoGZxoiXRHNA+?nBGU#f=lU7A}(b{Kta820{xQGo++9-HD7U2gC^Z}tVB zX-D1=rSnv1W}MoqJrp*0i`kgu2^Ov6+6p|tD7z>2hv2}f2Pr@V(5`T5OIy?NQ;#FX zGvGOe4|e8~%TDG=v;^KUL{*fj4`c7abM-8LNuG(^L~)U*b-&6Xis@y)fAC6|?U ze(bx=9Cp=l_T{Mmvxps_|BL`t@eBR${x{c;eJ-QFxbE8mWkHJ#NC8VWx3XP!?%4TE z?HAfP;~)Mqo0$sVtd zi~Y*TtY6G%pXfH;_2U~l7F&?cU^><=i=nSQ$0SE)iPaH`zCwN0z!138I$>-& zRd}Vk!duh83ZbO=ot5W1y?l|Uc6tr7=^!3{&fDkaho>BA11pw+3{px|lWBcYLOnTV zs@Sb>=Rw*F2L~pCc-J`T#u0RQ;vWX2W@K*V2rVH`avea@ko1=->ew@~tig ziv=k(u8H)8eJHt^N1G1sjuGp~(0OONTT!6>s+ke}-&J zY%&K`g6FGQ!6^6v>b{;Wz*3aBb&zbv&KJ_|4ackaz?)@XLi2)-tX9a)QNw+bvti)X;@rXoo~|VqM$N} zwrO3oIu7(#&!(5HB^-XJtteTw+D5>K;RD>ihHYj63#_rRO4Cj*I)DholGFV|O+FUN zxKIz=%SPzhvfG)MW^wA#$qRqD06FU_6iNfdpBMxs5r;;Gya}~XTBTK`XUAdGs zgX-fPKYdAl;xh_dQsLoTEshIRbP;m}H%`=a&pV9+!) z_Wi)MUIpU-s=~NQcL#I8G&`Yv=`F4M#(??Azgm51zTK?7yp{dCsn@H83bGFUP3)HR zrGEE~kS!hFJ>40svG5+;FEZv=ALyMzGlLo!c3i);z&=}E_SGK8$JdVX{7$F=gbu;c zKTo=*)G@mQCi62`Txn6Y$o#2Xrc!%sSDhzYC&x|nCtS=z?v=Q65chFW5T{7lTRJ2S zL>Ht5eE<8MBF@PcnIInGr_Y}0LPCfhPfkx47l4##`4d&Gvpx-yt;|ibU>E(^WzE(vL<)Q-`-y@F9CTQpnb?Di8)7zZ%=}MYFf(`l$^f*YCeaaN?liG3 z6rynrnfbk${XoVUU8zveh*-}OzPSI{!kSG(ajc$!|F6_RSL>?N6J|Q{k1*b10s6zAF-jrw{NJ zV`kKu)ac05@?^@My>JDF&IuMRwByLR8}=u}IYDsV2(Y9xrSDmOV{+gx_eX5himAZL z-|&!~G;fv#b=C#}}W(7)^tO)`o^UI*M)431x zmeY|!x*t3wU?v^AYl=4mRxH|g@CU7Vb`x97G}_aAB+lsJgQ&er+y+na3@&Lr88KrF zSpxh04>Azths8<`aTrgDq&6uaz$ zT=e4O3gJ30wI}x25=_ki@E3}VqRdJNXE((cd+rzt^k#}c^|y@JhMR zWN;kdf`_dD#k+r1$p5G!5XHqjJ(5X#j%uKsK0;j;P`TVLU%xAE#Vf}MyQ2^KU+!Yx zemB43plKCj4wxD_7{@p1P|)nzBQWkza6x~G4*6c~Q5LzGmLmZ8Q-VRIUn2n{922qg z^Fl}xX!BQLF)|6_T)n5olJVGp`e8idIFeTdXv{@b|1-%CMZP-YB<>%mj}=DJ0TSVy z9&&kpem?aLtRtQoN0{xG&og=88ZE|Q)SoiLq4>aaR@7{*O|ovtZ1f(fDqZ8fiC=vG z9^-z(BFBiTNZ=*wXS12+Y)Qk`zAF!=Bt{Q(?2K54;^&JFadeP_$uPY$8Y66sXkB4E zX=Mf}f!N4p>xAK^ns5FCF7tV;>Eyv=<;K9fC8e=TRN0VI$l>|Q*RhgJ8h0TuTezWa zD2yw$UvTE>?4ups1X=2-zT|8X4R!1i2z&vgB%W`bq7?M|_wPYbWClkhNCB#0-=1E~ zj)#@vYI7-lgPO?3076Zhgit3btCg$c{6>z~aAdO8#z+N%<>*L)X^V~)zIyb4@x{g3 
z+CGh`MGOPcYd?BTL&iBttVb(ca?0DW=k2vZ3`1BB)hM1GR3FH)?8Si)_V)|#uPGtI zIp{l|qrdJ5EBHlgR8G&2UMgtx!P8zJ5>tzG2Uxc$mwCXB$6m-0Zx8;onDF4AQ(G5O z?xjD$FNEIU%W`+a2T(?~%2)bgMk*_-;FY=*g+ zewK0+PD~d%;J5!1B_=F7)MvJzh-kemrEm9sL3^_}uAC{KNdKD^fYGRWkssa84kpAoD$d`T2a3%(o8Ed-HjVB9Cu6 zX!y5$XO_HTWu7pS)Y@43ZZ+~wm(X@S1eVpPD4)Y%8+00FTdF|D?5K$AbCCe4>{C4b zss+2BjMb%u@kZ8zHFy;Tz9vFq=#9?idC69c3x%Vt)M{ydUN&@~zf0Zj*9>l_Uhc!3 z?YxPZOfh4WRvUF<%V{GQkCjfdGE>#!fG4s^ZWp5_~Vwy%DKqtMl%Vm1}@_%e0Rbvpsii`A

zt+d_T@U$&t*xaD22(fH(W1r$Mo?3dc%Tncm=C6{V9y{v&VT#^1L04vDrLM9qBoZ4@ z6DBN#m57hX7$C(=#$Y5$bJmwA$T8jKD8+6A!-IJwA{WOfs%s}yxUw^?MRZjF$n_KN z6`_bGXcmE#5e4Ym)aQJ)xg3Pg0@k`iGuHe?f(5LtuzSq=nS^5z>vqU0EuZ&75V%Z{ zYyhRLN^)$c6Ds{f7*FBpE;n5iglx5PkHfWrj3`x^j^t z7ntuV`g!9Xg#^3!x)l*}IW=(Tz3>Y5l4uGaB&lz{GDjm2D5S$CExLT`I1#n^lBH7Y zDgpMag@`iETKAI=p<5E#LTkwzVR@=yY|uBVI1HG|8h+d;G-qfuj}-ZR6fN>EfCCW z9~wRQoAPEa#aO?3h?x{YvV*d+NtPkf&4V0k4|L=uj!U{L+oLa(z#&iuhJr3-PjO3R z5s?=nn_5^*^Rawr>>Nr@K(jwkB#JK-=+HqwfdO<+P5byeim)wvqGlP-P|~Nse8=XF zz`?RYB|D6SwS}C+YQv+;}k6$-%D(@+t14BL@vM z2q%q?f6D-A5s$_WY3{^G0F131bbh|g!}#BKw=HQ7mx;Dzg4Z*bTLQSfo{ed{4}NZW zfrRm^Ca$rlE{Ue~uYv>R9{3smwATcdM_6+yWIO z*ZRH~uXE@#p$XTbCt5j7j2=86e{9>HIB6xDzV+vAo&B?KUiMP|ttOElepnl%|DPqL b{|{}U^kRn2wo}j7|0ATu<;8xA7zX}7|B6mN From 10deb37c464dcbf01b65535fa18c77cd99326a1f Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Tue, 7 Nov 2023 11:32:09 +0100 Subject: [PATCH 002/446] Rework dockerfile layer to have a ligther image --- .dockerignore | 9 ++++-- development/Dockerfile | 70 ++++++++++++++++++++++-------------------- 2 files changed, 44 insertions(+), 35 deletions(-) diff --git a/.dockerignore b/.dockerignore index 0f39cf793b..e6b13d4587 100644 --- a/.dockerignore +++ b/.dockerignore @@ -1,4 +1,4 @@ -*.pyc +**/*.pyc *.env script.py node_modules @@ -7,7 +7,12 @@ node_modules .mypy_cache .pytest_cache *.env - +.DS_Store +.coverage +coverage.xml +.git* +.devcontainer +.husky frontend/node_modules # Direnv files (https://direnv.net/) diff --git a/development/Dockerfile b/development/Dockerfile index 618f470f4e..bb17fa11ef 100644 --- a/development/Dockerfile +++ b/development/Dockerfile @@ -11,13 +11,13 @@ RUN mkdir /remote RUN apt-get update && \ apt-get upgrade -y && \ - apt-get install --no-install-recommends -y pkg-config build-essential && \ - apt-get autoremove -y && \ - apt-get clean all && \ + apt-get install --no-install-recommends -y curl git pkg-config build-essential ca-certificates && \ + curl -sSL 
https://install.python-poetry.org | python3 - && \ rm -rf /var/lib/apt/lists/* && \ - pip --no-cache-dir install --upgrade pip wheel + rm -rf /var/lib/apt/lists/* && \ + pip --no-cache-dir install --no-compile --upgrade pip wheel -RUN curl -sSL https://install.python-poetry.org | python3 - +# RUN curl -sSL https://install.python-poetry.org | python3 - ENV PATH="${PATH}:/root/.local/bin" RUN poetry config virtualenvs.create false @@ -36,7 +36,7 @@ WORKDIR /source RUN npm install --omit=dev COPY frontend/ /source/ -RUN npm run build +RUN npm run build && npm cache clean --force # **************************************************************** # STAGE : Backend @@ -46,22 +46,19 @@ FROM base AS backend # -------------------------------------------- # Configure Git & Environment # -------------------------------------------- -RUN git config --global user.name "Infrahub" -RUN git config --global user.email "infrahub@opsmill.com" -RUN git config --global --add safe.directory '*' -RUN git config --global credential.usehttppath true -RUN git config --global credential.helper /usr/local/bin/infrahub-git-credential +RUN git config --global user.name "Infrahub" && \ + git config --global user.email "infrahub@opsmill.com" && \ + git config --global --add safe.directory '*' && \ + git config --global credential.usehttppath true && \ + git config --global credential.helper /usr/local/bin/infrahub-git-credential -RUN mkdir -p /opt/infrahub/git -RUN mkdir -p /opt/infrahub/storage -RUN mkdir -p /opt/infrahub/source +RUN mkdir -p /opt/infrahub/git /opt/infrahub/storage /opt/infrahub/source /opt/infrahub/frontend/dist WORKDIR /source # -------------------------------------------- # Import Frontend Build # -------------------------------------------- -RUN mkdir -p /opt/infrahub/frontend/dist COPY --from=frontend /source/dist/ /opt/infrahub/frontend/dist # -------------------------------------------- @@ -69,31 +66,38 @@ COPY --from=frontend /source/dist/ /opt/infrahub/frontend/dist # 
Copy in only pyproject.toml/poetry.lock to help with caching this layer if no updates to dependencies # -------------------------------------------- COPY poetry.lock pyproject.toml /source/ -RUN poetry install --no-interaction --no-ansi --no-root --no-directory +RUN poetry install --no-interaction --no-ansi --no-root --no-directory --no-dev # -------------------------------------------- # Copy in the rest of the source code and install the project # -------------------------------------------- -COPY . /source -RUN poetry install --no-interaction --no-ansi - -# **************************************************************** -# STAGE : Gitpod -# **************************************************************** - -FROM backend as gitpod +COPY . ./ +RUN poetry install --no-interaction --no-ansi --with server --with sync --with nornir # -------------------------------------------- -# Create new user and assign the right permissions +# Purge & Cleanup # -------------------------------------------- -ARG USER_ID=33333 -ARG GROUP_ID=33333 +RUN apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false pkg-config build-essential && \ + apt-get clean && \ + rm -rf /var/lib/apt/lists/* + +# # **************************************************************** +# # STAGE : Gitpod +# # **************************************************************** + +# FROM backend as gitpod + +# # -------------------------------------------- +# # Create new user and assign the right permissions +# # -------------------------------------------- +# ARG USER_ID=33333 +# ARG GROUP_ID=33333 -RUN addgroup --gid ${GROUP_ID} user -RUN adduser --disabled-password --gecos '' --uid ${USER_ID} --gid ${GROUP_ID} user +# RUN addgroup --gid ${GROUP_ID} user +# RUN adduser --disabled-password --gecos '' --uid ${USER_ID} --gid ${GROUP_ID} user -RUN chown -R ${USER_ID}:${GROUP_ID} /prom_shared -RUN chown -R ${USER_ID}:${GROUP_ID} /opt -RUN chown -R ${USER_ID}:${GROUP_ID} /remote +# RUN chown -R 
${USER_ID}:${GROUP_ID} /prom_shared +# RUN chown -R ${USER_ID}:${GROUP_ID} /opt +# RUN chown -R ${USER_ID}:${GROUP_ID} /remote -USER user +# USER user From 4b8706927be469ed3e273c45cf825c4ecf891f12 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Wed, 8 Nov 2023 15:13:11 +0100 Subject: [PATCH 003/446] put back gitpod --- development/Dockerfile | 33 ++++++++++++++++----------------- 1 file changed, 16 insertions(+), 17 deletions(-) diff --git a/development/Dockerfile b/development/Dockerfile index bb17fa11ef..8fa0ba08f6 100644 --- a/development/Dockerfile +++ b/development/Dockerfile @@ -17,7 +17,6 @@ RUN apt-get update && \ rm -rf /var/lib/apt/lists/* && \ pip --no-cache-dir install --no-compile --upgrade pip wheel -# RUN curl -sSL https://install.python-poetry.org | python3 - ENV PATH="${PATH}:/root/.local/bin" RUN poetry config virtualenvs.create false @@ -72,7 +71,7 @@ RUN poetry install --no-interaction --no-ansi --no-root --no-directory --no-dev # Copy in the rest of the source code and install the project # -------------------------------------------- COPY . 
./ -RUN poetry install --no-interaction --no-ansi --with server --with sync --with nornir +RUN poetry install --no-interaction --no-ansi # -------------------------------------------- # Purge & Cleanup @@ -81,23 +80,23 @@ RUN apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false apt-get clean && \ rm -rf /var/lib/apt/lists/* -# # **************************************************************** -# # STAGE : Gitpod -# # **************************************************************** +# **************************************************************** +# STAGE : Gitpod +# **************************************************************** -# FROM backend as gitpod +FROM backend as gitpod -# # -------------------------------------------- -# # Create new user and assign the right permissions -# # -------------------------------------------- -# ARG USER_ID=33333 -# ARG GROUP_ID=33333 +# -------------------------------------------- +# Create new user and assign the right permissions +# -------------------------------------------- +ARG USER_ID=33333 +ARG GROUP_ID=33333 -# RUN addgroup --gid ${GROUP_ID} user -# RUN adduser --disabled-password --gecos '' --uid ${USER_ID} --gid ${GROUP_ID} user +RUN addgroup --gid ${GROUP_ID} user +RUN adduser --disabled-password --gecos '' --uid ${USER_ID} --gid ${GROUP_ID} user -# RUN chown -R ${USER_ID}:${GROUP_ID} /prom_shared -# RUN chown -R ${USER_ID}:${GROUP_ID} /opt -# RUN chown -R ${USER_ID}:${GROUP_ID} /remote +RUN chown -R ${USER_ID}:${GROUP_ID} /prom_shared +RUN chown -R ${USER_ID}:${GROUP_ID} /opt +RUN chown -R ${USER_ID}:${GROUP_ID} /remote -# USER user +USER user From 4510960c660c2868a23d4862017f4461cc9f0511 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Wed, 8 Nov 2023 16:11:44 +0100 Subject: [PATCH 004/446] taking Damien comments in consideration --- development/Dockerfile | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/development/Dockerfile b/development/Dockerfile index 
8fa0ba08f6..5b42d660d9 100644 --- a/development/Dockerfile +++ b/development/Dockerfile @@ -65,7 +65,7 @@ COPY --from=frontend /source/dist/ /opt/infrahub/frontend/dist # Copy in only pyproject.toml/poetry.lock to help with caching this layer if no updates to dependencies # -------------------------------------------- COPY poetry.lock pyproject.toml /source/ -RUN poetry install --no-interaction --no-ansi --no-root --no-directory --no-dev +RUN poetry install --no-interaction --no-ansi --no-root --no-directory # -------------------------------------------- # Copy in the rest of the source code and install the project @@ -76,8 +76,8 @@ RUN poetry install --no-interaction --no-ansi # -------------------------------------------- # Purge & Cleanup # -------------------------------------------- -RUN apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false pkg-config build-essential && \ - apt-get clean && \ +RUN apt-get autoremove -y && \ + apt-get clean all && \ rm -rf /var/lib/apt/lists/* # **************************************************************** From 7bf46c5e4bd1395c770481268e39c408dff6748b Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Sat, 11 Nov 2023 17:17:37 -0700 Subject: [PATCH 005/446] Remove import from Infrahub in infrahubctl --- python_sdk/infrahub_ctl/check.py | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/python_sdk/infrahub_ctl/check.py b/python_sdk/infrahub_ctl/check.py index 41139b68e0..ef779cb2b5 100644 --- a/python_sdk/infrahub_ctl/check.py +++ b/python_sdk/infrahub_ctl/check.py @@ -6,12 +6,14 @@ from typing import Optional import typer -from infrahub.checks import INFRAHUB_CHECK_VARIABLE_TO_IMPORT from rich.logging import RichHandler app = typer.Typer() +INFRAHUB_CHECK_VARIABLE_TO_IMPORT = "INFRAHUB_CHECKS" + + # pylint: disable=too-many-nested-blocks,too-many-branches From 8ba64a2bc82dfa42855874d2b4b04e132bbb5318 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Sat, 11 Nov 2023 17:17:45 -0700 
Subject: [PATCH 006/446] Increase SDK version to 0.2.1 --- python_sdk/pyproject.toml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/python_sdk/pyproject.toml b/python_sdk/pyproject.toml index b021e21d35..6d9cd70676 100644 --- a/python_sdk/pyproject.toml +++ b/python_sdk/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "infrahub-sdk" -version = "0.2.0" +version = "0.2.1" description = "Python Client to interact with Infrahub" authors = ["OpsMill "] readme = "README.md" From bc344a3fdfdbef89bba77c0e2445e8ee9ae2c0f6 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:04:09 +0100 Subject: [PATCH 007/446] Redo ruff branches after fail rebase --- .github/workflows/ci.yml | 26 +++++------- pyproject.toml | 74 ++++++++++++++++++++++------------ tasks/backend.py | 65 +++++------------------------- tasks/ctl.py | 65 +++++------------------------- tasks/demo.py | 11 ++++- tasks/main.py | 23 +++-------- tasks/sdk.py | 86 +++++++++++----------------------------- tasks/shared.py | 8 +++- tasks/sync.py | 69 +++++--------------------------- 9 files changed, 134 insertions(+), 293 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 02621d8b3b..a5f507518c 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -28,6 +28,7 @@ env: BUILDKITE_COMMIT: ${{ github.sha }} jobs: + # ------------------------------------------ Check Files Changes ------------------------------------------ files-changed: name: Detect which file has changed runs-on: ubuntu-20.04 @@ -51,6 +52,7 @@ jobs: token: ${{ github.token }} filters: .github/file-filters.yml + # ------------------------------------------ All Linter ------------------------------------------ yaml-lint: if: needs.files-changed.outputs.yaml == 'true' needs: ["files-changed"] @@ -72,6 +74,7 @@ jobs: run: "pip install yamllint==1.29.0" - name: "Linting: yamllint" run: "yamllint ." 
+ javascript-lint: if: needs.files-changed.outputs.javascript == 'true' needs: ["files-changed"] @@ -92,6 +95,7 @@ jobs: - name: Run ESLint working-directory: ./frontend run: npm run eslint + python-lint: if: needs.files-changed.outputs.python == 'true' needs: ["files-changed"] @@ -101,12 +105,11 @@ jobs: - name: "Check out repository code" uses: "actions/checkout@v3" - name: "Setup environment" - run: "pip install black==23.1.0 ruff==0.0.265" - - name: "Linting: BLACK" - run: "black --check ." + run: "pip install ruff==0.1.5" - name: "Linting: ruff" - run: "ruff check ." + run: "ruff check . --fix" + # ------------------------------------------ Build Docker Image ------------------------------------------ # backend-build-docker: # runs-on: ubuntu-latest # steps: @@ -123,6 +126,7 @@ jobs: # cache-from: type=gha # cache-to: type=gha,mode=max + # ------------------------------------------ Tests by Component ------------------------------------------ python-sdk-tests: if: | always() && !cancelled() && @@ -143,10 +147,6 @@ jobs: run: "invoke test.build" - name: "Pull External Docker Images" run: "invoke test.pull" - - name: "Black Tests" - run: "invoke sdk.black --docker" - - name: "Isort Tests" - run: "invoke sdk.isort --docker" - name: "Pylint Tests" run: "invoke sdk.pylint --docker" - name: "Mypy Tests" @@ -190,10 +190,6 @@ jobs: run: "pip install toml invoke" - name: "Build Test Image" run: "invoke test.build" - - name: "Black Tests" - run: "invoke sync.black --docker" - - name: "Isort Tests" - run: "invoke sync.isort --docker" - name: "Pylint Tests" run: "invoke sync.pylint --docker" @@ -217,10 +213,6 @@ jobs: run: "invoke test.build" - name: "Pull External Docker Images" run: "invoke test.pull" - - name: "Black Tests" - run: "invoke backend.black --docker" - - name: "Isort Tests" - run: "invoke backend.isort --docker" - name: "Pylint Tests" run: "invoke backend.pylint --docker" - name: "Mypy Tests" @@ -321,6 +313,7 @@ jobs: parallel: true file: 
frontend/coverage/lcov.info + # ------------------------------------------ E2E Tests ------------------------------------------ # E2E-testing-memgraph: # needs: ["frontend-tests", "backend-tests-default", "python-sdk-tests"] # if: | @@ -449,6 +442,7 @@ jobs: if: failure() run: invoke demo.status + # ------------------------------------------ Coverall Report ------------------------------------------ coverall-report: needs: ["frontend-tests", "backend-tests-default", "python-sdk-tests"] if: | diff --git a/pyproject.toml b/pyproject.toml index 107a1f9d80..f6dcfc55b5 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -39,7 +39,6 @@ opentelemetry-exporter-otlp-proto-grpc = "^1.16.0" locust = "^2.13.1" [tool.poetry.group.dev.dependencies] -black = "*" pytest = "*" yamllint = "*" pylint = "*" @@ -48,7 +47,6 @@ ipython = "*" pytest-asyncio = "*" requests = "*" pre-commit = "^2.20.0" -isort = "*" autoflake = "*" pytest-clarity = "^1.0.1" pytest-httpx = "^0.22" @@ -57,7 +55,7 @@ types-ujson = "*" types-pyyaml = "*" typer-cli = "*" pytest-cov = "^4.0.0" -ruff = "0.1.0" +ruff = "^0.1.5" pytest-xdist = "^3.3.1" buildkite-test-collector = "^0.1.7" types-python-slugify = "^8.0.0.3" @@ -71,31 +69,13 @@ pynetbox = "^7.0.1" pynautobot = "^1.5.0" diffsync = "^1.8.0" + [tool.poetry.scripts] infrahub = "infrahub.cli:app" infrahub-git-credential = "infrahub.git_credential.helper:app" infrahub-git-askpass = "infrahub.git_credential.askpass:app" infrahub-sync = "infrahub_sync.cli:app" -[tool.black] -line-length = 120 -include = '\.pyi?$' -exclude = ''' - /( - \.git - | \.tox - | \.venv - | env/ - | _build - | build - | dist - | examples - )/ - ''' - -[tool.isort] -profile = "black" -known_first_party = [ "infrahub" ] [tool.coverage.run] branch = true @@ -262,21 +242,65 @@ disallow_untyped_defs = false [tool.ruff] +line-length = 120 + +exclude = [ + ".git", + ".tox", + ".venv", + "env", + "_build", + "build", + "dist", + "examples", +] + +task-tags = [ + "FIXME", + "TODO", + "XXX", +] + 
+[tool.ruff.lint] +preview = true select = [ + # mccabe complexity "C90", - "DTZ", + # pycodestyle errors "E", + # pycodestyle warnings + "W", + # pyflakes "F", + # isort-like checks + "I", + # flake8-datetimez + "DTZ", + # flake8-import-conventions "ICN", + # flake8-type-checking "TCH", + # flake8-debugger "T10", + # flake8-quotes "Q", - "W", + # flake8-2020 "YTT", ] -line-length = 170 +#https://docs.astral.sh/ruff/formatter/black/ +[tool.ruff.format] +quote-style = "double" +indent-style = "space" +skip-magic-trailing-comma = false +line-ending = "auto" + +[tool.ruff.lint.isort] +known-first-party = ["infrahub"] + +[tool.ruff.lint.pycodestyle] +max-line-length = 150 [tool.ruff.mccabe] # Target max-complexity=10 diff --git a/tasks/backend.py b/tasks/backend.py index c2ff3752e9..3b11788112 100644 --- a/tasks/backend.py +++ b/tasks/backend.py @@ -9,7 +9,7 @@ execute_command, get_env_vars, ) -from .utils import ESCAPED_REPO_PATH +from .utils import ESCAPED_REPO_PATH, REPO_BASE MAIN_DIRECTORY = "backend" NAMESPACE = "BACKEND" @@ -39,11 +39,11 @@ def generate_doc(context: Context): # Formatting tasks # ---------------------------------------------------------------------------- @task -def format_black(context: Context): - """Run black to format all Python files.""" +def format_ruff(context: Context): + """Run ruff to format all Python files.""" - print(f" - [{NAMESPACE}] Format code with black") - exec_cmd = f"black {MAIN_DIRECTORY}/" + print(f" - [{NAMESPACE}] Format code with ruff") + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -58,23 +58,12 @@ def format_autoflake(context: Context): context.run(exec_cmd) -@task -def format_isort(context: Context): - """Run isort to format all Python files.""" - - print(f" - [{NAMESPACE}] Format code with isort") - exec_cmd = f"isort {MAIN_DIRECTORY}" - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task(name="format") 
def format_all(context: Context): """This will run all formatter.""" - format_isort(context) format_autoflake(context) - format_black(context) + format_ruff(context) print(f" - [{NAMESPACE}] All formatters have been executed!") @@ -83,11 +72,11 @@ def format_all(context: Context): # Testing tasks # ---------------------------------------------------------------------------- @task -def black(context: Context, docker: bool = False): - """Run black to check that Python files adherence to black standards.""" +def ruff(context: Context, docker: bool = False): + """Run ruff to check that Python files adherence to black standards.""" - print(f" - [{NAMESPACE}] Check code with black") - exec_cmd = f"black --check --diff {MAIN_DIRECTORY}" + print(f" - [{NAMESPACE}] Check code with ruff") + exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) @@ -98,22 +87,6 @@ def black(context: Context, docker: bool = False): context.run(exec_cmd) -@task -def isort(context: Context, docker: bool = False): - """Run isort to check that Python files adherence to import standards.""" - - print(f" - [{NAMESPACE}] Check code with isort") - exec_cmd = f"isort --check --diff {MAIN_DIRECTORY}" - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run {build_test_envs()} infrahub-test {exec_cmd}" - print(exec_cmd) - - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task def mypy(context: Context, docker: bool = False): """This will run mypy for the specified name and Python version.""" @@ -146,28 +119,10 @@ def pylint(context: Context, docker: bool = False): context.run(exec_cmd) -@task -def ruff(context: Context, docker: bool = False): - """This will run ruff.""" - - print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY}" - - if docker: - compose_files_cmd = 
build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run {build_test_envs()} infrahub-test {exec_cmd}" - print(exec_cmd) - - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task def lint(context: Context, docker: bool = False): """This will run all linter.""" ruff(context, docker=docker) - black(context, docker=docker) - isort(context, docker=docker) pylint(context, docker=docker) mypy(context, docker=docker) diff --git a/tasks/ctl.py b/tasks/ctl.py index d6e61fb7f4..b4c0301bb5 100644 --- a/tasks/ctl.py +++ b/tasks/ctl.py @@ -8,7 +8,7 @@ execute_command, get_env_vars, ) -from .utils import ESCAPED_REPO_PATH +from .utils import ESCAPED_REPO_PATH, REPO_BASE MAIN_DIRECTORY = "ctl" NAMESPACE = "CTL" @@ -38,11 +38,11 @@ def generate_doc(context: Context): # Formatting tasks # ---------------------------------------------------------------------------- @task -def format_black(context: Context): - """Run black to format all Python files.""" +def format_ruff(context: Context): + """Run ruff to format all Python files.""" - print(f" - [{NAMESPACE}] Format code with black") - exec_cmd = f"black {MAIN_DIRECTORY}/" + print(f" - [{NAMESPACE}] Format code with ruff") + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -57,23 +57,12 @@ def format_autoflake(context: Context): context.run(exec_cmd) -@task -def format_isort(context: Context): - """Run isort to format all Python files.""" - - print(f" - [{NAMESPACE}] Format code with isort") - exec_cmd = f"isort {MAIN_DIRECTORY}" - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task(name="format") def format_all(context: Context): """This will run all formatter.""" - format_isort(context) format_autoflake(context) - format_black(context) + format_ruff(context) print(f" - [{NAMESPACE}] All formatters have been executed!") @@ -82,27 
+71,11 @@ def format_all(context: Context): # Testing tasks # ---------------------------------------------------------------------------- @task -def black(context: Context, docker: bool = False): - """Run black to check that Python files adherence to black standards.""" - - print(f" - [{NAMESPACE}] Check code with black") - exec_cmd = f"black --check --diff {MAIN_DIRECTORY}" - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run {build_test_envs()} infrahub-test {exec_cmd}" - print(exec_cmd) - - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - -@task -def isort(context: Context, docker: bool = False): - """Run isort to check that Python files adherence to import standards.""" +def ruff(context: Context, docker: bool = False): + """Run ruff to check that Python files adherence to standards.""" - print(f" - [{NAMESPACE}] Check code with isort") - exec_cmd = f"isort --check --diff {MAIN_DIRECTORY}" + print(f" - [{NAMESPACE}] Check code with ruff") + exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) @@ -147,28 +120,10 @@ def pylint(context: Context, docker: bool = False): context.run(exec_cmd) -@task -def ruff(context: Context, docker: bool = False): - """This will run ruff.""" - - print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY}" - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run {build_test_envs()} infrahub-test {exec_cmd}" - print(exec_cmd) - - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task def lint(context: Context, docker: bool = False): """This will run all linter.""" ruff(context, docker=docker) - black(context, docker=docker) - isort(context, docker=docker) 
pylint(context, docker=docker) mypy(context, docker=docker) diff --git a/tasks/demo.py b/tasks/demo.py index 06edb7478a..f2420dcd0c 100644 --- a/tasks/demo.py +++ b/tasks/demo.py @@ -32,7 +32,11 @@ @task(optional=["database"]) def build( - context, service: str = None, python_ver: str = PYTHON_VER, nocache: bool = False, database: str = INFRAHUB_DATABASE + context, + service: str = None, + python_ver: str = PYTHON_VER, + nocache: bool = False, + database: str = INFRAHUB_DATABASE, ): # pylint: disable=too-many-arguments """Build an image with the provided name and python version. @@ -204,7 +208,10 @@ def infra_git_import(context: Context, database: str = INFRAHUB_DATABASE): @task(optional=["database"]) def infra_git_create( - context: Context, database: str = INFRAHUB_DATABASE, name="demo-edge", location="/remote/infrahub-demo-edge" + context: Context, + database: str = INFRAHUB_DATABASE, + name="demo-edge", + location="/remote/infrahub-demo-edge", ): """Load some demo data.""" clean_query = re.sub(r"\n\s*", "", ADD_REPO_QUERY) diff --git a/tasks/main.py b/tasks/main.py index e19f7a7c03..48b55ed240 100644 --- a/tasks/main.py +++ b/tasks/main.py @@ -1,6 +1,6 @@ from invoke import Context, task -from .utils import ESCAPED_REPO_PATH +from .utils import ESCAPED_REPO_PATH, REPO_BASE MAIN_DIRECTORY = "tasks" NAMESPACE = "MAIN" @@ -10,12 +10,12 @@ # Formatting tasks # ---------------------------------------------------------------------------- @task -def format_black(context: Context): - """Run black to format all Python files.""" +def format_ruff(context: Context): + """Run ruff to format all Python files.""" - print(f" - [{NAMESPACE}] Format code with black") + print(f" - [{NAMESPACE}] Format code with ruff") with context.cd(ESCAPED_REPO_PATH): - exec_cmd = f"black {MAIN_DIRECTORY}/ models/" + exec_cmd = f"ruff format {MAIN_DIRECTORY} models/ --config {REPO_BASE}/pyproject.toml" context.run(exec_cmd) @@ -29,22 +29,11 @@ def format_autoflake(context: Context): 
context.run(exec_cmd) -@task -def format_isort(context: Context): - """Run isort to format all Python files.""" - - print(f" - [{NAMESPACE}] Format code with isort") - with context.cd(ESCAPED_REPO_PATH): - exec_cmd = f"isort {MAIN_DIRECTORY} models" - context.run(exec_cmd) - - @task(name="format", default=True) def format_all(context: Context): """This will run all formatter.""" - format_isort(context) format_autoflake(context) - format_black(context) + format_ruff(context) print(f" - [{NAMESPACE}] All formatters have been executed!") diff --git a/tasks/sdk.py b/tasks/sdk.py index ea1b0e2327..bf00b4e267 100644 --- a/tasks/sdk.py +++ b/tasks/sdk.py @@ -11,7 +11,7 @@ execute_command, get_env_vars, ) -from .utils import ESCAPED_REPO_PATH +from .utils import ESCAPED_REPO_PATH, REPO_BASE MAIN_DIRECTORY = "python_sdk" NAMESPACE = "SDK" @@ -22,12 +22,12 @@ # Formatting tasks # ---------------------------------------------------------------------------- @task -def format_black(context: Context): - """Run black to format all Python files.""" +def format_ruff(context: Context): + """Run ruff to format all Python files.""" - print(f" - [{NAMESPACE}] Format code with black") - exec_cmd = "black ." - with context.cd(MAIN_DIRECTORY_PATH): + print(f" - [{NAMESPACE}] Format code with ruff") + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" + with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -41,23 +41,12 @@ def format_autoflake(context: Context): context.run(exec_cmd) -@task -def format_isort(context: Context): - """Run isort to format all Python files.""" - - print(f" - [{NAMESPACE}] Format code with isort") - exec_cmd = "isort ." 
- with context.cd(MAIN_DIRECTORY_PATH): - context.run(exec_cmd) - - @task(name="format") def format_all(context: Context): """This will run all formatter.""" - format_isort(context) format_autoflake(context) - format_black(context) + format_ruff(context) print(f" - [{NAMESPACE}] All formatters have been executed!") @@ -66,34 +55,19 @@ def format_all(context: Context): # Testing tasks # ---------------------------------------------------------------------------- @task -def black(context: Context, docker: bool = False): - """Run black to check that Python files adherence to black standards.""" - - print(f" - [{NAMESPACE}] Check code with black") - exec_cmd = "black --check --diff ." - exec_directory = MAIN_DIRECTORY_PATH - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" - exec_directory = ESCAPED_REPO_PATH - print(exec_cmd) - - with context.cd(exec_directory): - context.run(exec_cmd) - - -@task -def isort(context: Context, docker: bool = False): - """Run isort to check that Python files adherence to import standards.""" +def ruff(context: Context, docker: bool = False): + """Run ruff to check that Python files adherence to black standards.""" - print(f" - [{NAMESPACE}] Check code with isort") - exec_cmd = "isort --check --diff ." + print(f" - [{NAMESPACE}] Check code with ruff") + exec_cmd = "ruff check . 
--fix" exec_directory = MAIN_DIRECTORY_PATH if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" + exec_cmd = ( + f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME}" + f" run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" + ) exec_directory = ESCAPED_REPO_PATH print(exec_cmd) @@ -111,7 +85,10 @@ def mypy(context: Context, docker: bool = False): if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" + exec_cmd = ( + f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME}" + f" run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" + ) exec_directory = ESCAPED_REPO_PATH print(exec_cmd) @@ -129,25 +106,10 @@ def pylint(context: Context, docker: bool = False): if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" - exec_directory = ESCAPED_REPO_PATH - print(exec_cmd) - - with context.cd(exec_directory): - context.run(exec_cmd) - - -@task -def ruff(context: Context, docker: bool = False): - """This will run ruff.""" - - print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = "ruff check ." 
- exec_directory = MAIN_DIRECTORY_PATH - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" + exec_cmd = ( + f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME}" + f" run --workdir /source/{MAIN_DIRECTORY} infrahub-test {exec_cmd}" + ) exec_directory = ESCAPED_REPO_PATH print(exec_cmd) @@ -159,8 +121,6 @@ def ruff(context: Context, docker: bool = False): def lint(context: Context, docker: bool = False): """This will run all linter.""" ruff(context, docker=docker) - black(context, docker=docker) - isort(context, docker=docker) pylint(context, docker=docker) mypy(context, docker=docker) diff --git a/tasks/shared.py b/tasks/shared.py index b57cc1bd8b..e3f5446950 100644 --- a/tasks/shared.py +++ b/tasks/shared.py @@ -93,7 +93,13 @@ class DatabaseType(str, Enum): PLATFORMS_PTY_ENABLE = ["Linux", "Darwin"] PLATFORMS_SUDO_DETECT = ["Linux"] -VOLUME_NAMES = ["database_data", "database_logs", "git_data", "git_remote_data", "storage_data"] +VOLUME_NAMES = [ + "database_data", + "database_logs", + "git_data", + "git_remote_data", + "storage_data", +] GITHUB_ENVS_TO_PASS = [ "GITHUB_ACTION", diff --git a/tasks/sync.py b/tasks/sync.py index 4c25f3de35..3e695d94e5 100644 --- a/tasks/sync.py +++ b/tasks/sync.py @@ -8,7 +8,7 @@ execute_command, get_env_vars, ) -from .utils import ESCAPED_REPO_PATH +from .utils import ESCAPED_REPO_PATH, REPO_BASE MAIN_DIRECTORY = "sync/infrahub-sync" NAMESPACE = "SYNC" @@ -18,11 +18,11 @@ # Formatting tasks # ---------------------------------------------------------------------------- @task -def format_black(context: Context): - """Run black to format all Python files.""" +def format_ruff(context: Context): + """Run ruff to format all Python files.""" - print(f" - [{NAMESPACE}] Format code with black") - exec_cmd = f"black {MAIN_DIRECTORY}/" + 
print(f" - [{NAMESPACE}] Format code with ruff") + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -37,23 +37,12 @@ def format_autoflake(context: Context): context.run(exec_cmd) -@task -def format_isort(context: Context): - """Run isort to format all Python files.""" - - print(f" - [{NAMESPACE}] Format code with isort") - exec_cmd = f"isort {MAIN_DIRECTORY}" - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task(name="format") def format_all(context: Context): """This will run all formatter.""" - format_isort(context) format_autoflake(context) - format_black(context) + format_ruff(context) print(f" - [{NAMESPACE}] All formatters have been executed!") @@ -62,29 +51,11 @@ def format_all(context: Context): # Testing tasks # ---------------------------------------------------------------------------- @task -def black(context: Context, docker: bool = False): - """Run black to check that Python files adherence to black standards.""" - - print(f" - [{NAMESPACE}] Check code with black") - exec_cmd = f"black --check --diff {MAIN_DIRECTORY}" - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = ( - f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run infrahub-test {exec_cmd}" - ) - print(exec_cmd) - - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - -@task -def isort(context: Context, docker: bool = False): - """Run isort to check that Python files adherence to import standards.""" +def ruff(context: Context, docker: bool = False): + """Run ruff to check that Python files adherence to black standards.""" - print(f" - [{NAMESPACE}] Check code with isort") - exec_cmd = f"isort --check --diff {MAIN_DIRECTORY}" + print(f" - [{NAMESPACE}] Check code with ruff") + exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) @@ 
-133,30 +104,10 @@ def pylint(context: Context, docker: bool = False): context.run(exec_cmd) -@task -def ruff(context: Context, docker: bool = False): - """This will run ruff.""" - - print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY}" - - if docker: - compose_files_cmd = build_test_compose_files_cmd(database=False) - exec_cmd = ( - f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME} run infrahub-test {exec_cmd}" - ) - print(exec_cmd) - - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - @task def lint(context: Context, docker: bool = False): """This will run all linter.""" ruff(context, docker=docker) - black(context, docker=docker) - isort(context, docker=docker) pylint(context, docker=docker) # mypy(context, docker=docker) From 94dd7cd2781844d62b94d490f535ef239d4c83f1 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:12:29 +0100 Subject: [PATCH 008/446] run ruff check + ruff format --- backend/infrahub/api/diff.py | 10 +++-- backend/infrahub/cli/db.py | 2 +- backend/infrahub/core/branch.py | 20 +++++++--- backend/infrahub/core/initialization.py | 6 ++- backend/infrahub/core/query/attribute.py | 15 ++----- backend/infrahub/core/query/diff.py | 16 ++------ backend/infrahub/core/query/node.py | 11 ++--- backend/infrahub/core/query/relationship.py | 4 +- backend/infrahub/core/query/standard_node.py | 12 ++---- backend/infrahub/core/schema.py | 5 ++- backend/infrahub/core/utils.py | 7 +--- backend/infrahub/git/repository.py | 6 +-- backend/infrahub/graphql/__init__.py | 3 +- backend/infrahub/graphql/generator.py | 4 +- .../graphql/mutations/artifact_definition.py | 4 +- .../graphql/mutations/graphql_query.py | 4 +- backend/infrahub/graphql/mutations/main.py | 4 +- .../graphql/mutations/proposed_change.py | 4 +- .../infrahub/graphql/mutations/repository.py | 4 +- backend/infrahub/graphql/queries/diff.py | 3 +- backend/infrahub/graphql/types/node.py | 4 +- 
backend/infrahub/graphql/types/union.py | 4 +- backend/infrahub/graphql/utils.py | 2 +- .../message_bus/operations/event/node.py | 3 +- .../operations/requests/proposed_change.py | 3 +- .../tests/unit/graphql/test_graphql_branch.py | 11 ++--- .../test_mutation_artifact_definition.py | 4 +- .../graphql/test_mutation_graphqlquery.py | 16 ++------ .../unit/graphql/test_mutation_update.py | 4 +- python_sdk/tests/unit/sdk/test_client.py | 40 +++++-------------- python_sdk/tests/unit/sdk/test_node.py | 4 +- sync/diffsync/diffsync/__init__.py | 6 +-- sync/diffsync/diffsync/diff.py | 6 +-- sync/diffsync/diffsync/helpers.py | 11 ++--- sync/diffsync/diffsync/store/__init__.py | 12 ++++-- sync/diffsync/diffsync/store/local.py | 5 +-- sync/diffsync/diffsync/store/redis.py | 6 +-- sync/diffsync/diffsync/utils.py | 2 +- sync/diffsync/tests/unit/conftest.py | 3 +- sync/diffsync/tests/unit/test_diff.py | 1 - sync/diffsync/tests/unit/test_diffsync.py | 5 +-- .../tests/unit/test_diffsync_model.py | 3 +- .../tests/unit/test_diffsync_model_flags.py | 1 - sync/diffsync/tests/unit/test_examples.py | 2 +- sync/diffsync/tests/unit/test_redisstore.py | 2 +- .../infrahub_sync/adapters/infrahub.py | 4 +- .../infrahub_sync/adapters/netbox.py | 3 +- sync/infrahub-sync/infrahub_sync/cli.py | 5 ++- .../infrahub_sync/generator/__init__.py | 1 + utilities/db_compare_query.py | 6 +-- 50 files changed, 127 insertions(+), 196 deletions(-) diff --git a/backend/infrahub/api/diff.py b/backend/infrahub/api/diff.py index 612a584685..f1702812dd 100644 --- a/backend/infrahub/api/diff.py +++ b/backend/infrahub/api/diff.py @@ -13,10 +13,12 @@ from infrahub import config from infrahub.api.dependencies import get_branch_dep, get_current_user, get_db from infrahub.core import get_branch, registry -from infrahub.core.branch import Branch # noqa: TCH001 -from infrahub.core.branch import Diff # noqa: TCH001 -from infrahub.core.branch import NodeDiffElement # noqa: TCH001 -from infrahub.core.branch import 
RelationshipDiffElement # noqa: TCH001 +from infrahub.core.branch import ( + Branch, # noqa: TCH001 + Diff, # noqa: TCH001 + NodeDiffElement, # noqa: TCH001 + RelationshipDiffElement, # noqa: TCH001 +) from infrahub.core.constants import ( BranchSupportType, DiffAction, diff --git a/backend/infrahub/cli/db.py b/backend/infrahub/cli/db.py index 1f4c5dcd6c..867ad5b25f 100644 --- a/backend/infrahub/cli/db.py +++ b/backend/infrahub/cli/db.py @@ -71,7 +71,7 @@ async def _load_test_data(dataset: str) -> None: def init( config_file: str = typer.Option( "infrahub.toml", envvar="INFRAHUB_CONFIG", help="Location of the configuration file to use for Infrahub" - ) + ), ) -> None: """Erase the content of the database and initialize it with the core schema.""" diff --git a/backend/infrahub/core/branch.py b/backend/infrahub/core/branch.py index f837452bb1..c212c809a4 100644 --- a/backend/infrahub/core/branch.py +++ b/backend/infrahub/core/branch.py @@ -246,7 +246,9 @@ async def delete(self, db: InfrahubDatabase) -> None: def get_query_filter_relationships( self, rel_labels: list, at: Optional[Union[Timestamp, str]] = None, include_outside_parentheses: bool = False ) -> Tuple[List, Dict]: - """Generate a CYPHER Query filter based on a list of relationships to query a part of the graph at a specific time and on a specific branch.""" + """ + Generate a CYPHER Query filter based on a list of relationships to query a part of the graph at a specific time and on a specific branch. + """ filters = [] params = {} @@ -279,7 +281,8 @@ def get_query_filter_relationships( return filters, params def get_query_filter_path(self, at: Optional[Union[Timestamp, str]] = None) -> Tuple[str, Dict]: - """Generate a CYPHER Query filter based on a path to query a part of the graph at a specific time and on a specific branch. + """ + Generate a CYPHER Query filter based on a path to query a part of the graph at a specific time and on a specific branch. 
Examples: >>> rels_filter, rels_params = self.branch.get_query_filter_path(at=self.at) @@ -355,7 +358,8 @@ def get_query_filter_relationships_diff( diff_from: Timestamp, diff_to: Timestamp, ) -> Tuple[List, Dict]: - """Generate a CYPHER Query filter to query all events that are applicable to a given branch based + """ + Generate a CYPHER Query filter to query all events that are applicable to a given branch based - The time when the branch as created - The branched_from time of the branch - The diff_to and diff_from time as provided @@ -398,7 +402,8 @@ def get_query_filter_range( start_time: Union[Timestamp, str], end_time: Union[Timestamp, str], ) -> Tuple[List, Dict]: - """Generate a CYPHER Query filter to query a range of values in the graph between start_time and end_time.""" + """ + Generate a CYPHER Query filter to query a range of values in the graph between start_time and end_time.""" filters = [] params = {} @@ -443,7 +448,8 @@ async def rebase(self, db: InfrahubDatabase, at: Optional[Union[str, Timestamp]] registry.branch[self.name] = self async def validate_branch(self, db: InfrahubDatabase) -> List[ObjectConflict]: - """Validate if a branch is eligible to be merged. + """ + Validate if a branch is eligible to be merged. - Must be conflict free both for data and repository - All checks must pass - Check schema changes @@ -1082,7 +1088,9 @@ async def init( ) async def has_conflict( - self, db: InfrahubDatabase, rpc_client: InfrahubRpcClient # pylint: disable=unused-argument + self, + db: InfrahubDatabase, + rpc_client: InfrahubRpcClient, # pylint: disable=unused-argument ) -> bool: """Return True if the same path has been modified on multiple branches. 
False otherwise""" diff --git a/backend/infrahub/core/initialization.py b/backend/infrahub/core/initialization.py index 41a4136eb8..2e2ae9402b 100644 --- a/backend/infrahub/core/initialization.py +++ b/backend/infrahub/core/initialization.py @@ -67,7 +67,8 @@ async def initialization(db: InfrahubDatabase): await registry.schema.load_schema_from_db(db=db, branch=default_branch) if default_branch.update_schema_hash(): LOGGER.warning( - f"{default_branch.name} | New schema detected after pulling the schema from the db : {hash_in_db!r} >> {default_branch.schema_hash.main!r}" + f"{default_branch.name} | New schema detected after pulling the schema from the db :" + f" {hash_in_db!r} >> {default_branch.schema_hash.main!r}" ) for branch in branches: @@ -80,7 +81,8 @@ async def initialization(db: InfrahubDatabase): if branch.update_schema_hash(): LOGGER.warning( - f"{branch.name} | New schema detected after pulling the schema from the db {hash_in_db!r} >> {branch.schema_hash.main!r}" + f"{branch.name} | New schema detected after pulling the schema from the db :" + f" {hash_in_db!r} >> {branch.schema_hash.main!r}" ) # --------------------------------------------------- diff --git a/backend/infrahub/core/query/attribute.py b/backend/infrahub/core/query/attribute.py index aa95f193ff..14fbd40a54 100644 --- a/backend/infrahub/core/query/attribute.py +++ b/backend/infrahub/core/query/attribute.py @@ -57,9 +57,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): MATCH (a { uuid: $attr_uuid }) MATCH (a)-[r:HAS_VALUE]-(av) WHERE %s - """ % ( - "\n AND ".join(rels_filter), - ) + """ % ("\n AND ".join(rels_filter),) self.add_to_query(query) @@ -140,14 +138,11 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): self.params["flag_value"] = getattr(self.attr, self.flag_name) self.params["flag_type"] = self.attr.get_kind() - query = ( - """ + query = """ MATCH (a { uuid: $attr_uuid }) MERGE (flag:Boolean { value: $flag_value }) CREATE (a)-[r:%s { 
branch: $branch, branch_level: $branch_level, status: "active", from: $at, to: null }]->(flag) - """ - % self.flag_name.upper() - ) + """ % self.flag_name.upper() self.add_to_query(query) self.return_labels = ["a", "flag", "r"] @@ -215,9 +210,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): MATCH (a { uuid: $attr_uuid }) MATCH (n)-[r1]-(a)-[r2:HAS_VALUE|IS_VISIBLE|IS_PROTECTED|HAS_SOURCE|HAS_OWNER]-(ap) WHERE %s - """ % ( - "\n AND ".join(rels_filter), - ) + """ % ("\n AND ".join(rels_filter),) self.add_to_query(query) diff --git a/backend/infrahub/core/query/diff.py b/backend/infrahub/core/query/diff.py index a88e2d2af7..97a38f998f 100644 --- a/backend/infrahub/core/query/diff.py +++ b/backend/infrahub/core/query/diff.py @@ -288,9 +288,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): r3.branch IN $branch_names AND r3.from >= $diff_from AND r3.from <= $diff_to AND ((r3.to >= $diff_from AND r3.to <= $diff_to) OR r3.to is NULL) ) - """ % "\n AND ".join( - rels_filter - ) + """ % "\n AND ".join(rels_filter) self.add_to_query(query) self.params["branch_names"] = self.branch_names @@ -332,9 +330,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): MATCH (a) WHERE a.uuid IN $ids MATCH (a)-[r:IS_VISIBLE|IS_PROTECTED|HAS_SOURCE|HAS_OWNER|HAS_VALUE]-(ap) WHERE %s - """ % ( - "\n AND ".join(rels_filter), - ) + """ % ("\n AND ".join(rels_filter),) self.add_to_query(query) self.return_labels = ["a", "ap", "r"] @@ -386,9 +382,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): MATCH (a) WHERE a.uuid IN $ids MATCH (a)-[r:IS_VISIBLE|IS_PROTECTED|HAS_SOURCE|HAS_OWNER|HAS_VALUE]-(ap) WHERE %s - """ % ( - "\n AND ".join(rels_filter), - ) + """ % ("\n AND ".join(rels_filter),) self.add_to_query(query) self.return_labels = ["a", "ap", "r"] @@ -444,9 +438,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): MATCH (rl) WHERE rl.uuid IN $ids MATCH 
(rl)-[r:IS_VISIBLE|IS_PROTECTED|HAS_SOURCE|HAS_OWNER]-(rp) WHERE %s - """ % ( - "\n AND ".join(rels_filter), - ) + """ % ("\n AND ".join(rels_filter),) self.params["at"] = self.at.to_string() diff --git a/backend/infrahub/core/query/node.py b/backend/infrahub/core/query/node.py index 1350f0d4ed..bc2a84f319 100644 --- a/backend/infrahub/core/query/node.py +++ b/backend/infrahub/core/query/node.py @@ -178,9 +178,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): ) WITH distinct n MATCH (n)-[:HAS_ATTRIBUTE|IS_RELATED]-(rn)-[:HAS_VALUE|IS_RELATED]-(rv) - """ % ":".join( - self.node.get_labels() - ) + """ % ":".join(self.node.get_labels()) self.params["at"] = at.to_string() @@ -275,14 +273,11 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): self.params.update(rel_params) - query = ( - """ + query = """ MATCH (a:Attribute) WHERE a.uuid IN $attrs_ids MATCH (a)-[r1:HAS_VALUE]-(av:AttributeValue) WHERE %s - """ - % rel_filter[0] - ) + """ % rel_filter[0] self.add_to_query(query) self.return_labels = ["a", "av", "r1"] diff --git a/backend/infrahub/core/query/relationship.py b/backend/infrahub/core/query/relationship.py index 80a642c9e5..2453245d6d 100644 --- a/backend/infrahub/core/query/relationship.py +++ b/backend/infrahub/core/query/relationship.py @@ -508,9 +508,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): # pylint: di MATCH (rl)-[rel_is_visible:IS_VISIBLE]-(is_visible) MATCH (rl)-[rel_is_protected:IS_PROTECTED]-(is_protected) WHERE all(r IN [ rel_is_visible, rel_is_protected] WHERE (%s)) - """ % ( - branch_filter, - ) + """ % (branch_filter,) self.add_to_query(query) diff --git a/backend/infrahub/core/query/standard_node.py b/backend/infrahub/core/query/standard_node.py index a9510a46ef..f5f866ad36 100644 --- a/backend/infrahub/core/query/standard_node.py +++ b/backend/infrahub/core/query/standard_node.py @@ -42,9 +42,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): query = """ 
CREATE (n:%s $node_prop) - """ % ( - node_type - ) + """ % (node_type) self.add_to_query(query=query) self.return_labels = ["n"] @@ -64,9 +62,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): query = """ MATCH (n:%s { uuid: $uuid }) SET n = $node_prop - """ % ( - self.node.get_type(), - ) + """ % (self.node.get_type(),) self.add_to_query(query=query) self.return_labels = ["n"] @@ -82,9 +78,7 @@ async def query_init(self, db: InfrahubDatabase, *args, **kwargs): query = """ MATCH (n:%s { uuid: $uuid }) DETACH DELETE (n) - """ % ( - self.node.get_type() - ) + """ % (self.node.get_type()) self.params["uuid"] = str(self.node_id) self.add_to_query(query) diff --git a/backend/infrahub/core/schema.py b/backend/infrahub/core/schema.py index 323459d3b5..fd0866b28a 100644 --- a/backend/infrahub/core/schema.py +++ b/backend/infrahub/core/schema.py @@ -286,7 +286,10 @@ def get_class(self): return ATTRIBUTE_TYPES[self.kind].get_infrahub_class() async def get_query_filter( - self, db: InfrahubDatabase, *args, **kwargs # pylint: disable=unused-argument + self, + db: InfrahubDatabase, + *args, + **kwargs, # pylint: disable=unused-argument ) -> Tuple[List[QueryElement], Dict[str, Any], List[str]]: return await self.get_class().get_query_filter(*args, **kwargs) diff --git a/backend/infrahub/core/utils.py b/backend/infrahub/core/utils.py index d97f7b3d1a..5948d9466e 100644 --- a/backend/infrahub/core/utils.py +++ b/backend/infrahub/core/utils.py @@ -21,16 +21,13 @@ async def add_relationship( at: Optional[Timestamp] = None, status=RelationshipStatus.ACTIVE, ): - create_rel_query = ( - """ + create_rel_query = """ MATCH (s) WHERE ID(s) = $src_node_id MATCH (d) WHERE ID(d) = $dst_node_id WITH s,d CREATE (s)-[r:%s { branch: $branch, branch_level: $branch_level, from: $at, to: null, status: $status }]->(d) RETURN ID(r) - """ - % str(rel_type).upper() - ) + """ % str(rel_type).upper() at = Timestamp(at) diff --git a/backend/infrahub/git/repository.py 
b/backend/infrahub/git/repository.py index e2662a2551..4f203badef 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -25,9 +25,8 @@ ValidationError, ) from infrahub_sdk.utils import YamlFile, compare_lists -from pydantic import BaseModel +from pydantic import BaseModel, validator from pydantic import ValidationError as PydanticValidationError -from pydantic import validator import infrahub.config as config from infrahub.checks import INFRAHUB_CHECK_VARIABLE_TO_IMPORT, InfrahubCheck @@ -220,7 +219,8 @@ def extract_repo_file_information( Args: full_filename (str): Absolute path to the file to load Example:/opt/infrahub/git/repo01/commits/71da[..]4b7/myfile.py root_directory: Absolute path to the root of the repository directory. Example:/opt/infrahub/git/repo01 - worktree_directory (str, optional): Absolute path to the root of the worktree directory. Defaults to None. example: /opt/infrahub/git/repo01/commits/71da[..]4b7/ + worktree_directory (str, optional): Absolute path to the root of the worktree directory. Defaults to None. 
+ Example: /opt/infrahub/git/repo01/commits/71da[..]4b7/ Returns: RepoFileInformation: Pydantic object to store all information about this file diff --git a/backend/infrahub/graphql/__init__.py b/backend/infrahub/graphql/__init__.py index d8fda35f75..e6ef406904 100644 --- a/backend/infrahub/graphql/__init__.py +++ b/backend/infrahub/graphql/__init__.py @@ -54,7 +54,8 @@ class Mutation(InfrahubBaseMutation, MutationMixin): async def get_gql_subscription( - db: InfrahubDatabase, branch: Union[Branch, str] = None # pylint: disable=unused-argument + db: InfrahubDatabase, + branch: Union[Branch, str] = None, # pylint: disable=unused-argument ) -> type[InfrahubBaseSubscription]: class Subscription(InfrahubBaseSubscription): pass diff --git a/backend/infrahub/graphql/generator.py b/backend/infrahub/graphql/generator.py index ce83e4468a..72f0119214 100644 --- a/backend/infrahub/graphql/generator.py +++ b/backend/infrahub/graphql/generator.py @@ -56,9 +56,7 @@ def load_node_interface(branch: Branch): registry.set_graphql_type(name=paginated_interface._meta.name, graphql_type=paginated_interface, branch=branch.name) -async def generate_object_types( - db: InfrahubDatabase, branch: Union[Branch, str] -): # pylint: disable=too-many-branches,too-many-statements +async def generate_object_types(db: InfrahubDatabase, branch: Union[Branch, str]): # pylint: disable=too-many-branches,too-many-statements """Generate all GraphQL objects for the schema and store them in the internal registry.""" branch = await get_branch(db=db, branch=branch) diff --git a/backend/infrahub/graphql/mutations/artifact_definition.py b/backend/infrahub/graphql/mutations/artifact_definition.py index c35cb37ff5..a22cdb5de7 100644 --- a/backend/infrahub/graphql/mutations/artifact_definition.py +++ b/backend/infrahub/graphql/mutations/artifact_definition.py @@ -23,9 +23,7 @@ class InfrahubArtifactDefinitionMutation(InfrahubMutationMixin, Mutation): @classmethod - def __init_subclass_with_meta__( - cls, schema: 
NodeSchema = None, _meta=None, **options - ): # pylint: disable=arguments-differ + def __init_subclass_with_meta__(cls, schema: NodeSchema = None, _meta=None, **options): # pylint: disable=arguments-differ # Make sure schema is a valid NodeSchema Node Class if not isinstance(schema, NodeSchema): raise ValueError(f"You need to pass a valid NodeSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/mutations/graphql_query.py b/backend/infrahub/graphql/mutations/graphql_query.py index 0de387b285..72e8ed17f1 100644 --- a/backend/infrahub/graphql/mutations/graphql_query.py +++ b/backend/infrahub/graphql/mutations/graphql_query.py @@ -15,9 +15,7 @@ class InfrahubGraphQLQueryMutation(InfrahubMutationMixin, Mutation): @classmethod - def __init_subclass_with_meta__( - cls, schema: NodeSchema = None, _meta=None, **options - ): # pylint: disable=arguments-differ + def __init_subclass_with_meta__(cls, schema: NodeSchema = None, _meta=None, **options): # pylint: disable=arguments-differ # Make sure schema is a valid NodeSchema Node Class if not isinstance(schema, NodeSchema): raise ValueError(f"You need to pass a valid NodeSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/mutations/main.py b/backend/infrahub/graphql/mutations/main.py index 7795f1ade1..318c51c54d 100644 --- a/backend/infrahub/graphql/mutations/main.py +++ b/backend/infrahub/graphql/mutations/main.py @@ -221,9 +221,7 @@ async def validate_constraints(cls, db: InfrahubDatabase, node: Node, branch: Op class InfrahubMutation(InfrahubMutationMixin, Mutation): @classmethod - def __init_subclass_with_meta__( - cls, schema: NodeSchema = None, _meta=None, **options - ): # pylint: disable=arguments-differ + def __init_subclass_with_meta__(cls, schema: NodeSchema = None, _meta=None, **options): # pylint: disable=arguments-differ # Make sure schema is a valid NodeSchema Node Class if not isinstance(schema, NodeSchema): raise ValueError(f"You 
need to pass a valid NodeSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/mutations/proposed_change.py b/backend/infrahub/graphql/mutations/proposed_change.py index 2e50b565e4..b99843ec0a 100644 --- a/backend/infrahub/graphql/mutations/proposed_change.py +++ b/backend/infrahub/graphql/mutations/proposed_change.py @@ -34,9 +34,7 @@ class CheckType(Enum): class InfrahubProposedChangeMutation(InfrahubMutationMixin, Mutation): @classmethod - def __init_subclass_with_meta__( - cls, schema: NodeSchema = None, _meta=None, **options - ): # pylint: disable=arguments-differ + def __init_subclass_with_meta__(cls, schema: NodeSchema = None, _meta=None, **options): # pylint: disable=arguments-differ # Make sure schema is a valid NodeSchema Node Class if not isinstance(schema, NodeSchema): raise ValueError(f"You need to pass a valid NodeSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/mutations/repository.py b/backend/infrahub/graphql/mutations/repository.py index ed30281a53..845912f9ec 100644 --- a/backend/infrahub/graphql/mutations/repository.py +++ b/backend/infrahub/graphql/mutations/repository.py @@ -21,9 +21,7 @@ class InfrahubRepositoryMutation(InfrahubMutationMixin, Mutation): @classmethod - def __init_subclass_with_meta__( - cls, schema: NodeSchema = None, _meta=None, **options - ): # pylint: disable=arguments-differ + def __init_subclass_with_meta__(cls, schema: NodeSchema = None, _meta=None, **options): # pylint: disable=arguments-differ # Make sure schema is a valid NodeSchema Node Class if not isinstance(schema, NodeSchema): raise ValueError(f"You need to pass a valid NodeSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/queries/diff.py b/backend/infrahub/graphql/queries/diff.py index 6c1a183aee..07022bc4bb 100644 --- a/backend/infrahub/graphql/queries/diff.py +++ b/backend/infrahub/graphql/queries/diff.py @@ -1,8 +1,7 @@ from __future__ 
import annotations -from typing import TYPE_CHECKING, Any, Dict +from typing import TYPE_CHECKING, Any, Dict, Optional, Union from typing import List as TypingList -from typing import Optional, Union from graphene import Boolean, Field, List, ObjectType, String diff --git a/backend/infrahub/graphql/types/node.py b/backend/infrahub/graphql/types/node.py index bfab303e0b..5673b83615 100644 --- a/backend/infrahub/graphql/types/node.py +++ b/backend/infrahub/graphql/types/node.py @@ -14,9 +14,7 @@ class InfrahubObjectOptions(ObjectTypeOptions): class InfrahubObject(ObjectType, GetListMixin): @classmethod - def __init_subclass_with_meta__( - cls, schema: NodeSchema = None, interfaces=(), _meta=None, **options - ): # pylint: disable=arguments-differ + def __init_subclass_with_meta__(cls, schema: NodeSchema = None, interfaces=(), _meta=None, **options): # pylint: disable=arguments-differ if not isinstance(schema, (NodeSchema, GenericSchema)): raise ValueError(f"You need to pass a valid NodeSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/types/union.py b/backend/infrahub/graphql/types/union.py index 131bd38e36..ead49fc736 100644 --- a/backend/infrahub/graphql/types/union.py +++ b/backend/infrahub/graphql/types/union.py @@ -19,9 +19,7 @@ class Meta: types = ("PlaceHolder",) @classmethod - def __init_subclass_with_meta__( - cls, schema: GroupSchema = None, types=(), _meta=None, **options - ): # pylint: disable=arguments-renamed + def __init_subclass_with_meta__(cls, schema: GroupSchema = None, types=(), _meta=None, **options): # pylint: disable=arguments-renamed if not isinstance(schema, GroupSchema): raise ValueError(f"You need to pass a valid GroupSchema in '{cls.__name__}.Meta', received '{schema}'") diff --git a/backend/infrahub/graphql/utils.py b/backend/infrahub/graphql/utils.py index 5f42dffaa9..19ff092e7f 100644 --- a/backend/infrahub/graphql/utils.py +++ b/backend/infrahub/graphql/utils.py @@ -270,7 +270,7 @@ def 
print_selection_set(selection_set: SelectionSetNode, level: int = 1) -> int: # print(f"in print_selection_set loop {field}") # The field we are at is already a level deeper, even if it doesn't have its own selection set. # max_depth = max(max_depth, level + 1) - print(f"{level*tab}{field.name.value}") + print(f"{level * tab}{field.name.value}") if selection_set := getattr(field, "selection_set", None): # max_depth = max(max_depth, self._get_query_depth(selection_set, level + 1)) print_selection_set(selection_set, level + 1) diff --git a/backend/infrahub/message_bus/operations/event/node.py b/backend/infrahub/message_bus/operations/event/node.py index a8fa02fa3c..60573179c1 100644 --- a/backend/infrahub/message_bus/operations/event/node.py +++ b/backend/infrahub/message_bus/operations/event/node.py @@ -6,7 +6,8 @@ async def mutated( - message: messages.EventNodeMutated, service: InfrahubServices # pylint: disable=unused-argument + message: messages.EventNodeMutated, + service: InfrahubServices, # pylint: disable=unused-argument ) -> None: log.debug( "Mutation on node", diff --git a/backend/infrahub/message_bus/operations/requests/proposed_change.py b/backend/infrahub/message_bus/operations/requests/proposed_change.py index 688c905bcd..e2a7f05b02 100644 --- a/backend/infrahub/message_bus/operations/requests/proposed_change.py +++ b/backend/infrahub/message_bus/operations/requests/proposed_change.py @@ -134,7 +134,8 @@ async def data_integrity(message: messages.RequestProposedChangeDataIntegrity, s async def schema_integrity( - message: messages.RequestProposedChangeSchemaIntegrity, service: InfrahubServices # pylint: disable=unused-argument + message: messages.RequestProposedChangeSchemaIntegrity, + service: InfrahubServices, # pylint: disable=unused-argument ) -> None: log.info(f"Got a request to process schema integrity defined in proposed_change: {message.proposed_change}") diff --git a/backend/tests/unit/graphql/test_graphql_branch.py 
b/backend/tests/unit/graphql/test_graphql_branch.py index 96161dcfcc..85cd911754 100644 --- a/backend/tests/unit/graphql/test_graphql_branch.py +++ b/backend/tests/unit/graphql/test_graphql_branch.py @@ -208,17 +208,14 @@ async def test_branch_query( root_value=None, variable_values={}, ) - name_query = ( - """ + name_query = """ query { Branch(name: "%s" ) { id name } } - """ - % branch3["name"] - ) + """ % branch3["name"] name_response = await graphql( schema, source=name_query, @@ -233,9 +230,7 @@ async def test_branch_query( name } } - """ % [ - branch3["id"] - ] + """ % [branch3["id"]] id_query = id_query.replace("'", '"') id_response = await graphql( diff --git a/backend/tests/unit/graphql/test_mutation_artifact_definition.py b/backend/tests/unit/graphql/test_mutation_artifact_definition.py index fab7b11cdf..d946353a9a 100644 --- a/backend/tests/unit/graphql/test_mutation_artifact_definition.py +++ b/backend/tests/unit/graphql/test_mutation_artifact_definition.py @@ -142,9 +142,7 @@ async def test_update_artifact_definition( } } } - """ % ( - definition1.id - ) + """ % (definition1.id) result = await graphql( schema=await generate_graphql_schema(db=db, include_subscription=False, branch=branch), diff --git a/backend/tests/unit/graphql/test_mutation_graphqlquery.py b/backend/tests/unit/graphql/test_mutation_graphqlquery.py index 8db07e250c..1eb34caa9e 100644 --- a/backend/tests/unit/graphql/test_mutation_graphqlquery.py +++ b/backend/tests/unit/graphql/test_mutation_graphqlquery.py @@ -47,11 +47,7 @@ async def test_create_query_no_vars(db: InfrahubDatabase, default_branch, regist } } } - """ % query_value.replace( - "\n", " " - ).replace( - '"', '\\"' - ) + """ % query_value.replace("\n", " ").replace('"', '\\"') result = await graphql( schema=await generate_graphql_schema(db=db, include_subscription=False, branch=default_branch), @@ -116,11 +112,7 @@ async def test_create_query_with_vars(db: InfrahubDatabase, default_branch, regi } } } - """ % 
query_value.replace( - "\n", " " - ).replace( - '"', '\\"' - ) + """ % query_value.replace("\n", " ").replace('"', '\\"') result = await graphql( schema=await generate_graphql_schema(db=db, include_subscription=False, branch=default_branch), @@ -276,9 +268,7 @@ async def test_update_query_no_update(db: InfrahubDatabase, default_branch, regi } } } - """ % ( - obj.id - ) + """ % (obj.id) result = await graphql( schema=await generate_graphql_schema(db=db, include_subscription=False, branch=default_branch), diff --git a/backend/tests/unit/graphql/test_mutation_update.py b/backend/tests/unit/graphql/test_mutation_update.py index cea6164f67..2832ee8e08 100644 --- a/backend/tests/unit/graphql/test_mutation_update.py +++ b/backend/tests/unit/graphql/test_mutation_update.py @@ -353,9 +353,7 @@ async def test_update_delete_optional_relationship_cardinality_one( } } } - """ % ( - car_accord_main.id, - ) + """ % (car_accord_main.id,) result = await graphql( schema=await generate_graphql_schema(db=db, include_subscription=False, branch=branch), source=query, diff --git a/python_sdk/tests/unit/sdk/test_client.py b/python_sdk/tests/unit/sdk/test_client.py index 053716e29e..5e7316f7eb 100644 --- a/python_sdk/tests/unit/sdk/test_client.py +++ b/python_sdk/tests/unit/sdk/test_client.py @@ -70,9 +70,7 @@ async def test_init_with_invalid_address(): assert "The configured address is not a valid url" in str(exc.value) -async def test_get_repositories( - client, mock_branches_list_query, mock_repositories_query -): # pylint: disable=unused-argument +async def test_get_repositories(client, mock_branches_list_query, mock_repositories_query): # pylint: disable=unused-argument repos = await client.get_list_repositories() expected_response = RepositoryData( @@ -86,9 +84,7 @@ async def test_get_repositories( @pytest.mark.parametrize("client_type", client_types) -async def test_method_all_with_limit( - clients, mock_query_repository_page1_2, client_type -): # pylint: disable=unused-argument 
+async def test_method_all_with_limit(clients, mock_query_repository_page1_2, client_type): # pylint: disable=unused-argument if client_type == "standard": repos = await clients.standard.all(kind="CoreRepository", limit=3) assert not clients.standard.store._store["CoreRepository"] @@ -126,9 +122,7 @@ async def test_method_all_multiple_pages( @pytest.mark.parametrize("client_type", client_types) -async def test_method_all_single_page( - clients, mock_query_repository_page1_1, client_type -): # pylint: disable=unused-argument +async def test_method_all_single_page(clients, mock_query_repository_page1_1, client_type): # pylint: disable=unused-argument if client_type == "standard": repos = await clients.standard.all(kind="CoreRepository") assert not clients.standard.store._store["CoreRepository"] @@ -158,9 +152,7 @@ async def test_method_all_generic(clients, mock_query_corenode_page1_1, client_t @pytest.mark.parametrize("client_type", client_types) -async def test_method_get_by_id( - httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type -): # pylint: disable=unused-argument +async def test_method_get_by_id(httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type): # pylint: disable=unused-argument response = { "data": { "CoreRepository": { @@ -205,9 +197,7 @@ async def test_method_get_by_id( @pytest.mark.parametrize("client_type", client_types) -async def test_method_get_by_default_filter( - httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type -): # pylint: disable=unused-argument +async def test_method_get_by_default_filter(httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type): # pylint: disable=unused-argument response = { "data": { "CoreRepository": { @@ -252,9 +242,7 @@ async def test_method_get_by_default_filter( @pytest.mark.parametrize("client_type", client_types) -async def test_method_get_by_name( - httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type -): # pylint: disable=unused-argument +async def 
test_method_get_by_name(httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type): # pylint: disable=unused-argument response = { "data": { "CoreRepository": { @@ -289,9 +277,7 @@ async def test_method_get_by_name( @pytest.mark.parametrize("client_type", client_types) -async def test_method_get_not_found( - httpx_mock: HTTPXMock, clients, mock_query_repository_page1_empty, client_type -): # pylint: disable=unused-argument +async def test_method_get_not_found(httpx_mock: HTTPXMock, clients, mock_query_repository_page1_empty, client_type): # pylint: disable=unused-argument with pytest.raises(NodeNotFound): if client_type == "standard": await clients.standard.get(kind="CoreRepository", name__value="infrahub-demo-core") @@ -315,9 +301,7 @@ async def test_method_get_found_many( @pytest.mark.parametrize("client_type", client_types) -async def test_method_get_invalid_filter( - httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type -): # pylint: disable=unused-argument +async def test_method_get_invalid_filter(httpx_mock: HTTPXMock, clients, mock_schema_query_01, client_type): # pylint: disable=unused-argument with pytest.raises(FilterNotFound) as excinfo: if client_type == "standard": await clients.standard.get(kind="CoreRepository", name__name="infrahub-demo-core") @@ -330,9 +314,7 @@ async def test_method_get_invalid_filter( @pytest.mark.parametrize("client_type", client_types) -async def test_method_filters_many( - httpx_mock: HTTPXMock, clients, mock_query_repository_page1_1, client_type -): # pylint: disable=unused-argument +async def test_method_filters_many(httpx_mock: HTTPXMock, clients, mock_query_repository_page1_1, client_type): # pylint: disable=unused-argument if client_type == "standard": repos = await clients.standard.filters( kind="CoreRepository", @@ -378,9 +360,7 @@ async def test_method_filters_many( @pytest.mark.parametrize("client_type", client_types) -async def test_method_filters_empty( - httpx_mock: HTTPXMock, clients, 
mock_query_repository_page1_empty, client_type -): # pylint: disable=unused-argument +async def test_method_filters_empty(httpx_mock: HTTPXMock, clients, mock_query_repository_page1_empty, client_type): # pylint: disable=unused-argument if client_type == "standard": repos = await clients.standard.filters( kind="CoreRepository", diff --git a/python_sdk/tests/unit/sdk/test_node.py b/python_sdk/tests/unit/sdk/test_node.py index 0207124be2..7f1e5e1976 100644 --- a/python_sdk/tests/unit/sdk/test_node.py +++ b/python_sdk/tests/unit/sdk/test_node.py @@ -331,9 +331,7 @@ async def test_query_data_generic(clients, mock_schema_query_02, client_type): @pytest.mark.parametrize("client_type", client_types) -async def test_query_data_generic_fragment( - clients, mock_schema_query_02, client_type -): # pylint: disable=unused-argument +async def test_query_data_generic_fragment(clients, mock_schema_query_02, client_type): # pylint: disable=unused-argument if client_type == "standard": client: InfrahubClient = getattr(clients, client_type) # type: ignore[annotation-unchecked] corenode_schema: GenericSchema = await client.schema.get(kind="CoreNode") # type: ignore[annotation-unchecked] diff --git a/sync/diffsync/diffsync/__init__.py b/sync/diffsync/diffsync/__init__.py index 1f9bad81bf..4ebffc9b44 100644 --- a/sync/diffsync/diffsync/__init__.py +++ b/sync/diffsync/diffsync/__init__.py @@ -17,12 +17,12 @@ from inspect import isclass from typing import Callable, ClassVar, Dict, List, Mapping, Optional, Text, Tuple, Type, Union -from pydantic import BaseModel, PrivateAttr import structlog # type: ignore +from pydantic import BaseModel, PrivateAttr from diffsync.diff import Diff -from diffsync.enum import DiffSyncModelFlags, DiffSyncFlags, DiffSyncStatus -from diffsync.exceptions import DiffClassMismatch, ObjectAlreadyExists, ObjectStoreWrongType, ObjectNotFound +from diffsync.enum import DiffSyncFlags, DiffSyncModelFlags, DiffSyncStatus +from diffsync.exceptions import 
DiffClassMismatch, ObjectAlreadyExists, ObjectNotFound, ObjectStoreWrongType from diffsync.helpers import DiffSyncDiffer, DiffSyncSyncer from diffsync.store import BaseStore from diffsync.store.local import LocalStore diff --git a/sync/diffsync/diffsync/diff.py b/sync/diffsync/diffsync/diff.py index afeea436c8..9df81aa852 100644 --- a/sync/diffsync/diffsync/diff.py +++ b/sync/diffsync/diffsync/diff.py @@ -16,11 +16,11 @@ """ from functools import total_ordering -from typing import Any, Iterator, Iterable, Mapping, Optional, Text, Type +from typing import Any, Iterable, Iterator, Mapping, Optional, Text, Type -from .exceptions import ObjectAlreadyExists -from .utils import intersection, OrderedDefaultDict from .enum import DiffSyncActions +from .exceptions import ObjectAlreadyExists +from .utils import OrderedDefaultDict, intersection class Diff: diff --git a/sync/diffsync/diffsync/helpers.py b/sync/diffsync/diffsync/helpers.py index 07f249edce..ab3ba45d0f 100644 --- a/sync/diffsync/diffsync/helpers.py +++ b/sync/diffsync/diffsync/helpers.py @@ -14,15 +14,16 @@ See the License for the specific language governing permissions and limitations under the License. 
""" -from collections.abc import Iterable as ABCIterable, Mapping as ABCMapping -from typing import Callable, Iterable, List, Mapping, Optional, Tuple, Type, TYPE_CHECKING - import asyncio +from collections.abc import Iterable as ABCIterable +from collections.abc import Mapping as ABCMapping +from typing import TYPE_CHECKING, Callable, Iterable, List, Mapping, Optional, Tuple, Type + import structlog # type: ignore from .diff import Diff, DiffElement -from .enum import DiffSyncModelFlags, DiffSyncFlags, DiffSyncStatus, DiffSyncActions -from .exceptions import ObjectNotFound, ObjectNotCreated, ObjectNotUpdated, ObjectNotDeleted, ObjectCrudException +from .enum import DiffSyncActions, DiffSyncFlags, DiffSyncModelFlags, DiffSyncStatus +from .exceptions import ObjectCrudException, ObjectNotCreated, ObjectNotDeleted, ObjectNotFound, ObjectNotUpdated from .utils import intersection, symmetric_difference if TYPE_CHECKING: # pragma: no cover diff --git a/sync/diffsync/diffsync/store/__init__.py b/sync/diffsync/diffsync/store/__init__.py index aed32ede47..a8bc999d0f 100644 --- a/sync/diffsync/diffsync/store/__init__.py +++ b/sync/diffsync/diffsync/store/__init__.py @@ -1,19 +1,23 @@ """BaseStore module.""" -from typing import Dict, List, Mapping, Text, Tuple, Type, Union, TYPE_CHECKING, Optional, Set +from typing import TYPE_CHECKING, Dict, List, Mapping, Optional, Set, Text, Tuple, Type, Union + import structlog # type: ignore from diffsync.exceptions import ObjectNotFound if TYPE_CHECKING: - from diffsync import DiffSyncModel - from diffsync import DiffSync + from diffsync import DiffSync, DiffSyncModel class BaseStore: """Reference store to be implemented in different backends.""" def __init__( - self, *args, diffsync: Optional["DiffSync"] = None, name: str = "", **kwargs # pylint: disable=unused-argument + self, + *args, + diffsync: Optional["DiffSync"] = None, + name: str = "", + **kwargs, # pylint: disable=unused-argument ) -> None: """Init method for BaseStore.""" 
self.diffsync = diffsync diff --git a/sync/diffsync/diffsync/store/local.py b/sync/diffsync/diffsync/store/local.py index eaf6fe5a0e..387a3563dc 100644 --- a/sync/diffsync/diffsync/store/local.py +++ b/sync/diffsync/diffsync/store/local.py @@ -1,12 +1,11 @@ """LocalStore module.""" from collections import defaultdict -from typing import List, Mapping, Text, Type, Union, TYPE_CHECKING, Dict, Set +from typing import TYPE_CHECKING, Dict, List, Mapping, Set, Text, Type, Union -from diffsync.exceptions import ObjectNotFound, ObjectAlreadyExists +from diffsync.exceptions import ObjectAlreadyExists, ObjectNotFound from diffsync.store import BaseStore - if TYPE_CHECKING: from diffsync import DiffSyncModel diff --git a/sync/diffsync/diffsync/store/redis.py b/sync/diffsync/diffsync/store/redis.py index 83ca68cbb0..648fe70f9e 100644 --- a/sync/diffsync/diffsync/store/redis.py +++ b/sync/diffsync/diffsync/store/redis.py @@ -1,8 +1,8 @@ """RedisStore module.""" import copy import uuid -from pickle import loads, dumps # nosec -from typing import List, Mapping, Text, Type, Union, TYPE_CHECKING, Set +from pickle import dumps, loads # nosec +from typing import TYPE_CHECKING, List, Mapping, Set, Text, Type, Union try: from redis import Redis @@ -11,7 +11,7 @@ print("Redis is not installed. Have you installed diffsync with redis extra? 
`pip install diffsync[redis]`") raise ierr -from diffsync.exceptions import ObjectNotFound, ObjectStoreException, ObjectAlreadyExists +from diffsync.exceptions import ObjectAlreadyExists, ObjectNotFound, ObjectStoreException from diffsync.store import BaseStore if TYPE_CHECKING: diff --git a/sync/diffsync/diffsync/utils.py b/sync/diffsync/diffsync/utils.py index 94b0aa9483..d83ec676ed 100644 --- a/sync/diffsync/diffsync/utils.py +++ b/sync/diffsync/diffsync/utils.py @@ -16,7 +16,7 @@ """ from collections import OrderedDict -from typing import Iterator, List, Dict, Optional +from typing import Dict, Iterator, List, Optional SPACE = " " BRANCH = "│ " diff --git a/sync/diffsync/tests/unit/conftest.py b/sync/diffsync/tests/unit/conftest.py index 07b8fc8d8b..d43d7552af 100644 --- a/sync/diffsync/tests/unit/conftest.py +++ b/sync/diffsync/tests/unit/conftest.py @@ -17,10 +17,9 @@ from typing import ClassVar, List, Mapping, Optional, Tuple import pytest - from diffsync import DiffSync, DiffSyncModel from diffsync.diff import Diff, DiffElement -from diffsync.exceptions import ObjectNotCreated, ObjectNotUpdated, ObjectNotDeleted +from diffsync.exceptions import ObjectNotCreated, ObjectNotDeleted, ObjectNotUpdated @pytest.fixture diff --git a/sync/diffsync/tests/unit/test_diff.py b/sync/diffsync/tests/unit/test_diff.py index 63c6310269..b26172c458 100644 --- a/sync/diffsync/tests/unit/test_diff.py +++ b/sync/diffsync/tests/unit/test_diff.py @@ -16,7 +16,6 @@ """ import pytest - from diffsync.diff import Diff, DiffElement from diffsync.exceptions import ObjectAlreadyExists diff --git a/sync/diffsync/tests/unit/test_diffsync.py b/sync/diffsync/tests/unit/test_diffsync.py index ebf0e494ed..fce8b9ddbe 100644 --- a/sync/diffsync/tests/unit/test_diffsync.py +++ b/sync/diffsync/tests/unit/test_diffsync.py @@ -4,12 +4,11 @@ from unittest import mock import pytest - from diffsync import DiffSync, DiffSyncModel from diffsync.enum import DiffSyncFlags, DiffSyncModelFlags -from 
diffsync.exceptions import DiffClassMismatch, ObjectAlreadyExists, ObjectNotFound, ObjectCrudException +from diffsync.exceptions import DiffClassMismatch, ObjectAlreadyExists, ObjectCrudException, ObjectNotFound -from .conftest import Site, Device, Interface, TrackedDiff, BackendA, PersonA +from .conftest import BackendA, Device, Interface, PersonA, Site, TrackedDiff def test_diffsync_default_name_type(generic_diffsync): diff --git a/sync/diffsync/tests/unit/test_diffsync_model.py b/sync/diffsync/tests/unit/test_diffsync_model.py index 90ecbc85a6..7222e7fd76 100644 --- a/sync/diffsync/tests/unit/test_diffsync_model.py +++ b/sync/diffsync/tests/unit/test_diffsync_model.py @@ -18,10 +18,9 @@ from typing import List import pytest - from diffsync import DiffSyncModel from diffsync.enum import DiffSyncModelFlags -from diffsync.exceptions import ObjectStoreWrongType, ObjectAlreadyExists, ObjectNotFound +from diffsync.exceptions import ObjectAlreadyExists, ObjectNotFound, ObjectStoreWrongType from .conftest import Device, Interface diff --git a/sync/diffsync/tests/unit/test_diffsync_model_flags.py b/sync/diffsync/tests/unit/test_diffsync_model_flags.py index b7950aefaa..2156fd8ad9 100644 --- a/sync/diffsync/tests/unit/test_diffsync_model_flags.py +++ b/sync/diffsync/tests/unit/test_diffsync_model_flags.py @@ -16,7 +16,6 @@ """ import pytest - from diffsync.enum import DiffSyncModelFlags from diffsync.exceptions import ObjectNotFound diff --git a/sync/diffsync/tests/unit/test_examples.py b/sync/diffsync/tests/unit/test_examples.py index 2f9af164a3..3b95de90ec 100644 --- a/sync/diffsync/tests/unit/test_examples.py +++ b/sync/diffsync/tests/unit/test_examples.py @@ -15,8 +15,8 @@ limitations under the License. 
""" -from os.path import join, dirname import subprocess +from os.path import dirname, join EXAMPLES = join(dirname(dirname(dirname(__file__))), "examples") diff --git a/sync/diffsync/tests/unit/test_redisstore.py b/sync/diffsync/tests/unit/test_redisstore.py index 67d011d0a6..87f98dcaae 100644 --- a/sync/diffsync/tests/unit/test_redisstore.py +++ b/sync/diffsync/tests/unit/test_redisstore.py @@ -1,7 +1,7 @@ """Testing of RedisStore.""" import pytest -from diffsync.store.redis import RedisStore from diffsync.exceptions import ObjectStoreException +from diffsync.store.redis import RedisStore def _get_path_from_redisdb(redisdb_instance): diff --git a/sync/infrahub-sync/infrahub_sync/adapters/infrahub.py b/sync/infrahub-sync/infrahub_sync/adapters/infrahub.py index 6285deca55..bccc182f1f 100644 --- a/sync/infrahub-sync/infrahub_sync/adapters/infrahub.py +++ b/sync/infrahub-sync/infrahub_sync/adapters/infrahub.py @@ -30,7 +30,7 @@ def update_node(node: InfrahubNodeSync, attrs: dict): new_peer_ids = [node._client.store.get(key=value, kind=rel.peer).id for value in list(attr_value)] attr = getattr(node, attr_name) existing_peer_ids = attr.peer_ids - in_both, existing_only, new_only = compare_lists(existing_peer_ids, new_peer_ids) + in_both, existing_only, new_only = compare_lists(existing_peer_ids, new_peer_ids) # noqa: F841 for id in existing_only: attr.remove(id) @@ -90,7 +90,7 @@ def infrahub_node_to_diffsync(self, node: InfrahubNodeSync) -> dict: # Is it the right place to do it or are we missing some de-serialize ? 
# got a ValidationError from pydantic while trying to get the model(**data) # for IPHost and IPInterface - if type(attr.value) != type(str) and attr.value: + if attr.value and not isinstance(attr.value, str): data[attr_name] = str(attr.value) else: data[attr_name] = attr.value diff --git a/sync/infrahub-sync/infrahub_sync/adapters/netbox.py b/sync/infrahub-sync/infrahub_sync/adapters/netbox.py index 1f54c27903..55416d6ca1 100644 --- a/sync/infrahub-sync/infrahub_sync/adapters/netbox.py +++ b/sync/infrahub-sync/infrahub_sync/adapters/netbox.py @@ -86,7 +86,8 @@ def netbox_obj_to_diffsync(self, obj: NetboxRecord, mapping: SchemaMappingModel, nodes = [item for item in self.store.get_all(model=field.reference)] if not nodes: raise IndexError( - f"Unable to get '{field.mapping}' with '{field.reference}' reference from store. The available models are {self.store.get_all_model_names()}" + f"Unable to get '{field.mapping}' with '{field.reference}' reference from store." + f" The available models are {self.store.get_all_model_names()}" ) if not field_is_list: if node := get_value(obj, field.mapping): diff --git a/sync/infrahub-sync/infrahub_sync/cli.py b/sync/infrahub-sync/infrahub_sync/cli.py index 2b390ce68a..bfdc89dc69 100644 --- a/sync/infrahub-sync/infrahub_sync/cli.py +++ b/sync/infrahub-sync/infrahub_sync/cli.py @@ -8,11 +8,12 @@ import typer import yaml from infrahub_sdk import InfrahubClientSync -from infrahub_sync import SyncAdapter, SyncConfig, SyncInstance -from infrahub_sync.generator import render_template from potenda import Potenda from rich.console import Console +from infrahub_sync import SyncAdapter, SyncConfig, SyncInstance +from infrahub_sync.generator import render_template + app = typer.Typer() console = Console() diff --git a/sync/infrahub-sync/infrahub_sync/generator/__init__.py b/sync/infrahub-sync/infrahub_sync/generator/__init__.py index 4cb680516c..e6912c8eef 100644 --- a/sync/infrahub-sync/infrahub_sync/generator/__init__.py +++ 
b/sync/infrahub-sync/infrahub_sync/generator/__init__.py @@ -9,6 +9,7 @@ RelationshipKind, RelationshipSchema, ) + from infrahub_sync import SyncConfig from infrahub_sync.generator.utils import list_to_set, list_to_str diff --git a/utilities/db_compare_query.py b/utilities/db_compare_query.py index 786d736f83..a30336e653 100644 --- a/utilities/db_compare_query.py +++ b/utilities/db_compare_query.py @@ -73,7 +73,7 @@ async def compare_query(): response_time2.append(time.time() - time_start) def time_to_ms(input): - return f"{int(input*1000)}ms" + return f"{int(input * 1000)}ms" print("-----------------------------------------") avg1 = sum(response_time1) / len(response_time1) @@ -81,6 +81,6 @@ def time_to_ms(input): print(f" Query 1 {time_to_ms(avg1)} | {[time_to_ms(x) for x in response_time1]}") print(f" Query 2 {time_to_ms(avg2)} | {[time_to_ms(x) for x in response_time2]}") if avg1 < avg2: - print(f" Query 1 is faster by {time_to_ms(avg2-avg1)} {int((avg1/avg2)*100)}%") + print(f" Query 1 is faster by {time_to_ms(avg2 - avg1)} {int((avg1 / avg2) * 100)}%") else: - print(f" Query 2 is faster by {time_to_ms(avg1-avg2)} {int((avg2/avg1)*100)}%") + print(f" Query 2 is faster by {time_to_ms(avg1 - avg2)} {int((avg2 / avg1) * 100)}%") From 6ef36d65f075cff2b8cd3913aa3b67ca2158eb97 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:16:58 +0100 Subject: [PATCH 009/446] missing misc files --- .dockerignore | 1 + .gitignore | 2 + .pre-commit-config.yaml | 4 -- poetry.lock | 104 ++++++++-------------------------------- 4 files changed, 24 insertions(+), 87 deletions(-) diff --git a/.dockerignore b/.dockerignore index e6b13d4587..fe17421ddd 100644 --- a/.dockerignore +++ b/.dockerignore @@ -4,6 +4,7 @@ script.py node_modules .venv .ruff_cache +**/.ruff_cache .mypy_cache .pytest_cache *.env diff --git a/.gitignore b/.gitignore index 9f77e863ab..8c367a87dd 100644 --- a/.gitignore +++ b/.gitignore @@ -9,6 +9,8 @@ development/docker-compose.override.yml 
development/docker-compose.dev-override.yml .DS_Store .python-version +.ruff_cache +**/.ruff_cache # Direnv files (https://direnv.net/) .direnv/ diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index ff6ae90dce..230d4662c3 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -10,7 +10,3 @@ repos: - id: check-toml - id: check-yaml - id: end-of-file-fixer - - repo: https://github.com/pycqa/isort - rev: 5.10.1 - hooks: - - id: isort diff --git a/poetry.lock b/poetry.lock index 6b78579021..a3fa8eadf7 100644 --- a/poetry.lock +++ b/poetry.lock @@ -263,48 +263,6 @@ files = [ tests = ["pytest (>=3.2.1,!=3.3.0)"] typecheck = ["mypy"] -[[package]] -name = "black" -version = "23.11.0" -description = "The uncompromising code formatter." -optional = false -python-versions = ">=3.8" -files = [ - {file = "black-23.11.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:dbea0bb8575c6b6303cc65017b46351dc5953eea5c0a59d7b7e3a2d2f433a911"}, - {file = "black-23.11.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:412f56bab20ac85927f3a959230331de5614aecda1ede14b373083f62ec24e6f"}, - {file = "black-23.11.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d136ef5b418c81660ad847efe0e55c58c8208b77a57a28a503a5f345ccf01394"}, - {file = "black-23.11.0-cp310-cp310-win_amd64.whl", hash = "sha256:6c1cac07e64433f646a9a838cdc00c9768b3c362805afc3fce341af0e6a9ae9f"}, - {file = "black-23.11.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cf57719e581cfd48c4efe28543fea3d139c6b6f1238b3f0102a9c73992cbb479"}, - {file = "black-23.11.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:698c1e0d5c43354ec5d6f4d914d0d553a9ada56c85415700b81dc90125aac244"}, - {file = "black-23.11.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:760415ccc20f9e8747084169110ef75d545f3b0932ee21368f63ac0fee86b221"}, - {file = "black-23.11.0-cp311-cp311-win_amd64.whl", hash = 
"sha256:58e5f4d08a205b11800332920e285bd25e1a75c54953e05502052738fe16b3b5"}, - {file = "black-23.11.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:45aa1d4675964946e53ab81aeec7a37613c1cb71647b5394779e6efb79d6d187"}, - {file = "black-23.11.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4c44b7211a3a0570cc097e81135faa5f261264f4dfaa22bd5ee2875a4e773bd6"}, - {file = "black-23.11.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2a9acad1451632021ee0d146c8765782a0c3846e0e0ea46659d7c4f89d9b212b"}, - {file = "black-23.11.0-cp38-cp38-win_amd64.whl", hash = "sha256:fc7f6a44d52747e65a02558e1d807c82df1d66ffa80a601862040a43ec2e3142"}, - {file = "black-23.11.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:7f622b6822f02bfaf2a5cd31fdb7cd86fcf33dab6ced5185c35f5db98260b055"}, - {file = "black-23.11.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:250d7e60f323fcfc8ea6c800d5eba12f7967400eb6c2d21ae85ad31c204fb1f4"}, - {file = "black-23.11.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5133f5507007ba08d8b7b263c7aa0f931af5ba88a29beacc4b2dc23fcefe9c06"}, - {file = "black-23.11.0-cp39-cp39-win_amd64.whl", hash = "sha256:421f3e44aa67138ab1b9bfbc22ee3780b22fa5b291e4db8ab7eee95200726b07"}, - {file = "black-23.11.0-py3-none-any.whl", hash = "sha256:54caaa703227c6e0c87b76326d0862184729a69b73d3b7305b6288e1d830067e"}, - {file = "black-23.11.0.tar.gz", hash = "sha256:4c68855825ff432d197229846f971bc4d6666ce90492e5b02013bcaca4d9ab05"}, -] - -[package.dependencies] -click = ">=8.0.0" -mypy-extensions = ">=0.4.3" -packaging = ">=22.0" -pathspec = ">=0.9.0" -platformdirs = ">=2" -tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} -typing-extensions = {version = ">=4.0.1", markers = "python_version < \"3.11\""} - -[package.extras] -colorama = ["colorama (>=0.4.3)"] -d = ["aiohttp (>=3.7.4)"] -jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"] -uvloop = ["uvloop (>=0.15.2)"] - [[package]] name = "blinker" version = 
"1.7.0" @@ -1928,7 +1886,7 @@ testing = ["pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", [[package]] name = "infrahub-sdk" -version = "0.2.0" +version = "0.2.1" description = "Python Client to interact with Infrahub" optional = false python-versions = "^3.8" @@ -2177,16 +2135,6 @@ files = [ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = 
"sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, @@ -3333,7 +3281,6 @@ files = [ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"}, - {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"}, {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"}, {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"}, {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"}, @@ -3341,15 +3288,8 @@ files = [ {file = 
"PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"}, - {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"}, {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"}, {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, - {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, - {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, - {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, - {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, - {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, - {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"}, {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"}, @@ -3366,7 +3306,6 @@ files = [ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"}, - {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"}, {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"}, {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"}, {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"}, @@ -3374,7 +3313,6 @@ files = [ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"}, - {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"}, {file = 
"PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"}, {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"}, {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, @@ -3570,28 +3508,28 @@ files = [ [[package]] name = "ruff" -version = "0.1.0" -description = "An extremely fast Python linter, written in Rust." +version = "0.1.5" +description = "An extremely fast Python linter and code formatter, written in Rust." optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.1.0-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:87114e254dee35e069e1b922d85d4b21a5b61aec759849f393e1dbb308a00439"}, - {file = "ruff-0.1.0-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:764f36d2982cc4a703e69fb73a280b7c539fd74b50c9ee531a4e3fe88152f521"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65f4b7fb539e5cf0f71e9bd74f8ddab74cabdd673c6fb7f17a4dcfd29f126255"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:299fff467a0f163baa282266b310589b21400de0a42d8f68553422fa6bf7ee01"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d412678bf205787263bb702c984012a4f97e460944c072fd7cfa2bd084857c4"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:a5391b49b1669b540924640587d8d24128e45be17d1a916b1801d6645e831581"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ee8cd57f454cdd77bbcf1e11ff4e0046fb6547cac1922cc6e3583ce4b9c326d1"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fa7aeed7bc23861a2b38319b636737bf11cfa55d2109620b49cf995663d3e888"}, - {file = 
"ruff-0.1.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b04cd4298b43b16824d9a37800e4c145ba75c29c43ce0d74cad1d66d7ae0a4c5"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:7186ccf54707801d91e6314a016d1c7895e21d2e4cd614500d55870ed983aa9f"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:d88adfd93849bc62449518228581d132e2023e30ebd2da097f73059900d8dce3"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_i686.whl", hash = "sha256:ad2ccdb3bad5a61013c76a9c1240fdfadf2c7103a2aeebd7bcbbed61f363138f"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:b77f6cfa72c6eb19b5cac967cc49762ae14d036db033f7d97a72912770fd8e1c"}, - {file = "ruff-0.1.0-py3-none-win32.whl", hash = "sha256:480bd704e8af1afe3fd444cc52e3c900b936e6ca0baf4fb0281124330b6ceba2"}, - {file = "ruff-0.1.0-py3-none-win_amd64.whl", hash = "sha256:a76ba81860f7ee1f2d5651983f87beb835def94425022dc5f0803108f1b8bfa2"}, - {file = "ruff-0.1.0-py3-none-win_arm64.whl", hash = "sha256:45abdbdab22509a2c6052ecf7050b3f5c7d6b7898dc07e82869401b531d46da4"}, - {file = "ruff-0.1.0.tar.gz", hash = "sha256:ad6b13824714b19c5f8225871cf532afb994470eecb74631cd3500fe817e6b3f"}, + {file = "ruff-0.1.5-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:32d47fc69261c21a4c48916f16ca272bf2f273eb635d91c65d5cd548bf1f3d96"}, + {file = "ruff-0.1.5-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:171276c1df6c07fa0597fb946139ced1c2978f4f0b8254f201281729981f3c17"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17ef33cd0bb7316ca65649fc748acc1406dfa4da96a3d0cde6d52f2e866c7b39"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b2c205827b3f8c13b4a432e9585750b93fd907986fe1aec62b2a02cf4401eee6"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:bb408e3a2ad8f6881d0f2e7ad70cddb3ed9f200eb3517a91a245bbe27101d379"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:f20dc5e5905ddb407060ca27267c7174f532375c08076d1a953cf7bb016f5a24"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aafb9d2b671ed934998e881e2c0f5845a4295e84e719359c71c39a5363cccc91"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a4894dddb476597a0ba4473d72a23151b8b3b0b5f958f2cf4d3f1c572cdb7af7"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a00a7ec893f665ed60008c70fe9eeb58d210e6b4d83ec6654a9904871f982a2a"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:a8c11206b47f283cbda399a654fd0178d7a389e631f19f51da15cbe631480c5b"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:fa29e67b3284b9a79b1a85ee66e293a94ac6b7bb068b307a8a373c3d343aa8ec"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:9b97fd6da44d6cceb188147b68db69a5741fbc736465b5cea3928fdac0bc1aeb"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:721f4b9d3b4161df8dc9f09aa8562e39d14e55a4dbaa451a8e55bdc9590e20f4"}, + {file = "ruff-0.1.5-py3-none-win32.whl", hash = "sha256:f80c73bba6bc69e4fdc73b3991db0b546ce641bdcd5b07210b8ad6f64c79f1ab"}, + {file = "ruff-0.1.5-py3-none-win_amd64.whl", hash = "sha256:c21fe20ee7d76206d290a76271c1af7a5096bc4c73ab9383ed2ad35f852a0087"}, + {file = "ruff-0.1.5-py3-none-win_arm64.whl", hash = "sha256:82bfcb9927e88c1ed50f49ac6c9728dab3ea451212693fe40d08d314663e412f"}, + {file = "ruff-0.1.5.tar.gz", hash = "sha256:5cbec0ef2ae1748fb194f420fb03fb2c25c3258c86129af7172ff8f198f125ab"}, ] [[package]] @@ -4700,4 +4638,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"] [metadata] lock-version = "2.0" python-versions = "^3.8, < 3.12" -content-hash = 
"47172bdc08306a34ab0368d124995c0c45c0240a80304137425ad52868e0c20d" +content-hash = "cb7d601e824197389e1d89421aa4930b690959caacc279fb1d123f6d86be0c7d" From e2a5c578acda829dbd372a0f69a0bd349b081642 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:33:00 +0100 Subject: [PATCH 010/446] add back pylint comment after split of the line --- backend/infrahub/graphql/__init__.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/backend/infrahub/graphql/__init__.py b/backend/infrahub/graphql/__init__.py index e6ef406904..4b95237aa4 100644 --- a/backend/infrahub/graphql/__init__.py +++ b/backend/infrahub/graphql/__init__.py @@ -54,7 +54,7 @@ class Mutation(InfrahubBaseMutation, MutationMixin): async def get_gql_subscription( - db: InfrahubDatabase, + db: InfrahubDatabase, # pylint: disable=unused-argument branch: Union[Branch, str] = None, # pylint: disable=unused-argument ) -> type[InfrahubBaseSubscription]: class Subscription(InfrahubBaseSubscription): From 0424a3a15ddbb6ee0b3d0a711615033939fa5a43 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:37:51 +0100 Subject: [PATCH 011/446] force config for ruff check --- .github/workflows/ci.yml | 2 +- backend/infrahub/graphql/__init__.py | 2 +- python_sdk/pyproject.toml | 74 ++++++++++++++++++---------- tasks/backend.py | 2 +- tasks/ctl.py | 2 +- tasks/sdk.py | 2 +- tasks/sync.py | 2 +- 7 files changed, 55 insertions(+), 31 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index a5f507518c..8b1f027ed3 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -107,7 +107,7 @@ jobs: - name: "Setup environment" run: "pip install ruff==0.1.5" - name: "Linting: ruff" - run: "ruff check . --fix" + run: "ruff check .
--fix --config pyproject.toml" # ------------------------------------------ Build Docker Image ------------------------------------------ # backend-build-docker: diff --git a/backend/infrahub/graphql/__init__.py b/backend/infrahub/graphql/__init__.py index 4b95237aa4..7af0d31397 100644 --- a/backend/infrahub/graphql/__init__.py +++ b/backend/infrahub/graphql/__init__.py @@ -54,7 +54,7 @@ class Mutation(InfrahubBaseMutation, MutationMixin): async def get_gql_subscription( - db: InfrahubDatabase, # pylint: disable=unused-argument + db: InfrahubDatabase, # pylint: disable=unused-argument branch: Union[Branch, str] = None, # pylint: disable=unused-argument ) -> type[InfrahubBaseSubscription]: class Subscription(InfrahubBaseSubscription): diff --git a/python_sdk/pyproject.toml b/python_sdk/pyproject.toml index 6d9cd70676..a2bf834fd6 100644 --- a/python_sdk/pyproject.toml +++ b/python_sdk/pyproject.toml @@ -37,7 +37,6 @@ pyyaml = "^6.0" gitpython = "3.1.40" [tool.poetry.group.dev.dependencies] -black = "*" pytest = "*" yamllint = "*" pylint = "*" @@ -46,7 +45,6 @@ ipython = "*" pytest-asyncio = "*" requests = "*" pre-commit = "^2.20.0" -isort = "*" autoflake = "*" pytest-clarity = "^1.0.1" pytest-httpx = "^0.22" @@ -64,26 +62,6 @@ types-python-slugify = "^8.0.0.3" infrahub = "infrahub.cli:app" infrahubctl = "infrahub_ctl.cli:app" -[tool.black] -line-length = 120 -include = '\.pyi?$' -exclude = ''' - /( - \.git - | \.tox - | \.venv - | env/ - | _build - | build - | dist - | examples - )/ - ''' - -[tool.isort] -profile = "black" -known_first_party = [ "infrahub_sdk", "infrahub_ctl" ] - [tool.coverage.run] branch = true @@ -160,20 +138,65 @@ ignore_errors = true [tool.ruff] +line-length = 120 + +exclude = [ + ".git", + ".tox", + ".venv", + "env", + "_build", + "build", + "dist", + "examples", +] + +task-tags = [ + "FIXME", + "TODO", + "XXX", +] + +[tool.ruff.lint] +preview = true + select = [ + # mccabe complexity "C90", - "DTZ", + # pycodestyle errors "E", + # pycodestyle 
warnings + "W", + # pyflakes "F", + # isort-like checks + "I", + # flake8-datetimez + "DTZ", + # flake8-import-conventions "ICN", + # flake8-type-checking "TCH", + # flake8-debugger "T10", + # flake8-quotes "Q", - "W", + # flake8-2020 "YTT", ] -line-length = 170 +#https://docs.astral.sh/ruff/formatter/black/ +[tool.ruff.format] +quote-style = "double" +indent-style = "space" +skip-magic-trailing-comma = false +line-ending = "auto" + +[tool.ruff.lint.isort] +known-first-party = ["infrahub_sdk"] + +[tool.ruff.lint.pycodestyle] +max-line-length = 150 [tool.ruff.mccabe] # Target max-complexity=10 @@ -181,6 +204,7 @@ max-complexity = 31 [tool.ruff.per-file-ignores] + [build-system] requires = ["poetry-core"] build-backend = "poetry.core.masonry.api" \ No newline at end of file diff --git a/tasks/backend.py b/tasks/backend.py index 3b11788112..4316304a61 100644 --- a/tasks/backend.py +++ b/tasks/backend.py @@ -76,7 +76,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix" + exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix --config {REPO_BASE}/pyproject.toml" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) diff --git a/tasks/ctl.py b/tasks/ctl.py index b4c0301bb5..f017d5a7bb 100644 --- a/tasks/ctl.py +++ b/tasks/ctl.py @@ -75,7 +75,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix" + exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix --config {REPO_BASE}/pyproject.toml" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) diff --git a/tasks/sdk.py b/tasks/sdk.py index bf00b4e267..3c90a0eadb 100644 --- a/tasks/sdk.py +++ b/tasks/sdk.py @@ -59,7 +59,7 @@ def ruff(context: Context, docker: bool = 
False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = "ruff check . --fix" + exec_cmd = f"ruff check . --fix --config {REPO_BASE}/pyproject.toml" exec_directory = MAIN_DIRECTORY_PATH if docker: diff --git a/tasks/sync.py b/tasks/sync.py index 3e695d94e5..957253a59f 100644 --- a/tasks/sync.py +++ b/tasks/sync.py @@ -55,7 +55,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix" + exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix --config {REPO_BASE}/pyproject.toml" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) From be63256693b9f7e19a5c1715e2c132c20cc74713 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:39:10 +0100 Subject: [PATCH 012/446] run ruff check with config file --- python_sdk/infrahub_ctl/branch.py | 2 +- python_sdk/infrahub_ctl/cli.py | 4 ++-- python_sdk/infrahub_ctl/client.py | 3 ++- python_sdk/infrahub_ctl/schema.py | 2 +- python_sdk/infrahub_ctl/validate.py | 2 +- python_sdk/infrahub_sdk/graphql.py | 6 +++--- python_sdk/infrahub_sdk/node.py | 3 ++- python_sdk/tests/conftest.py | 1 - python_sdk/tests/integration/conftest.py | 10 +++++----- python_sdk/tests/integration/test_infrahub_client.py | 8 ++++---- .../tests/integration/test_infrahub_client_sync.py | 8 ++++---- python_sdk/tests/integration/test_node.py | 8 ++++---- python_sdk/tests/integration/test_object_store.py | 1 - python_sdk/tests/integration/test_schema.py | 4 ++-- python_sdk/tests/unit/ctl/test_branch_app.py | 3 +-- python_sdk/tests/unit/ctl/test_cli.py | 3 +-- python_sdk/tests/unit/ctl/test_schema_app.py | 5 ++--- python_sdk/tests/unit/ctl/test_validate_app.py | 3 +-- python_sdk/tests/unit/sdk/conftest.py | 3 +-- python_sdk/tests/unit/sdk/test_branch.py | 1 -
python_sdk/tests/unit/sdk/test_client.py | 3 +-- python_sdk/tests/unit/sdk/test_config.py | 3 +-- python_sdk/tests/unit/sdk/test_graphql.py | 1 - python_sdk/tests/unit/sdk/test_node.py | 3 +-- python_sdk/tests/unit/sdk/test_object_store.py | 3 +-- python_sdk/tests/unit/sdk/test_schema.py | 1 - python_sdk/tests/unit/sdk/test_store.py | 1 - python_sdk/tests/unit/sdk/test_timestamp.py | 1 - python_sdk/tests/unit/sdk/test_utils.py | 1 - 29 files changed, 41 insertions(+), 56 deletions(-) diff --git a/python_sdk/infrahub_ctl/branch.py b/python_sdk/infrahub_ctl/branch.py index e1de2119c9..839d1c65b3 100644 --- a/python_sdk/infrahub_ctl/branch.py +++ b/python_sdk/infrahub_ctl/branch.py @@ -6,6 +6,7 @@ from typing import Dict, Generator, List, Optional, Union import typer +from infrahub_sdk import Error, GraphQLError from rich.console import Console from rich.console import group as rich_group from rich.panel import Panel @@ -18,7 +19,6 @@ print_graphql_errors, render_action_rich, ) -from infrahub_sdk import Error, GraphQLError app = typer.Typer() diff --git a/python_sdk/infrahub_ctl/cli.py b/python_sdk/infrahub_ctl/cli.py index e88aa5bf10..32f3f74fb5 100644 --- a/python_sdk/infrahub_ctl/cli.py +++ b/python_sdk/infrahub_ctl/cli.py @@ -9,6 +9,8 @@ import jinja2 import typer +from infrahub_sdk.exceptions import GraphQLError +from infrahub_sdk.schema import InfrahubRepositoryConfig from pydantic import ValidationError from rich.console import Console from rich.logging import RichHandler @@ -29,8 +31,6 @@ parse_cli_vars, ) from infrahub_ctl.validate import app as validate_app -from infrahub_sdk.exceptions import GraphQLError -from infrahub_sdk.schema import InfrahubRepositoryConfig app = typer.Typer(pretty_exceptions_show_locals=False) diff --git a/python_sdk/infrahub_ctl/client.py b/python_sdk/infrahub_ctl/client.py index bb43a1c00a..0d6c0ed89f 100644 --- a/python_sdk/infrahub_ctl/client.py +++ b/python_sdk/infrahub_ctl/client.py @@ -1,9 +1,10 @@ from typing import Any 
-import infrahub_ctl.config as config from infrahub_sdk import InfrahubClient, InfrahubClientSync from infrahub_sdk.config import Config +import infrahub_ctl.config as config + async def initialize_client(**kwargs: Any) -> InfrahubClient: client_config = {} diff --git a/python_sdk/infrahub_ctl/schema.py b/python_sdk/infrahub_ctl/schema.py index f031f79c0b..9d7d6ac12e 100644 --- a/python_sdk/infrahub_ctl/schema.py +++ b/python_sdk/infrahub_ctl/schema.py @@ -5,13 +5,13 @@ import typer import yaml +from infrahub_sdk.utils import find_files from pydantic import BaseModel, ValidationError from rich.console import Console from rich.logging import RichHandler import infrahub_ctl.config as config from infrahub_ctl.client import initialize_client -from infrahub_sdk.utils import find_files app = typer.Typer() diff --git a/python_sdk/infrahub_ctl/validate.py b/python_sdk/infrahub_ctl/validate.py index 7f6094be5b..b528d4483d 100644 --- a/python_sdk/infrahub_ctl/validate.py +++ b/python_sdk/infrahub_ctl/validate.py @@ -5,6 +5,7 @@ import typer import yaml +from infrahub_sdk.exceptions import GraphQLError from pydantic import ValidationError from rich.console import Console from ujson import JSONDecodeError @@ -13,7 +14,6 @@ from infrahub_ctl.client import initialize_client, initialize_client_sync from infrahub_ctl.exceptions import QueryNotFoundError from infrahub_ctl.utils import find_graphql_query, get_branch, parse_cli_vars -from infrahub_sdk.exceptions import GraphQLError app = typer.Typer() diff --git a/python_sdk/infrahub_sdk/graphql.py b/python_sdk/infrahub_sdk/graphql.py index 87748decec..72db6f1b0b 100644 --- a/python_sdk/infrahub_sdk/graphql.py +++ b/python_sdk/infrahub_sdk/graphql.py @@ -84,7 +84,7 @@ def render_input_block(data: dict, offset: int = 4, indentation: int = 4) -> Lis lines.append(f"{offset_str}{key}: " + "[") for item in value: if isinstance(item, dict): - lines.append(f"{offset_str}{' '*indentation}" + "{") + lines.append(f"{offset_str}{' ' * 
indentation}" + "{") lines.extend( render_input_block( data=item, @@ -92,9 +92,9 @@ def render_input_block(data: dict, offset: int = 4, indentation: int = 4) -> Lis indentation=indentation, ) ) - lines.append(f"{offset_str}{' '*indentation}" + "},") + lines.append(f"{offset_str}{' ' * indentation}" + "},") else: - lines.append(f"{offset_str}{' '*indentation}{convert_to_graphql_as_string(item)},") + lines.append(f"{offset_str}{' ' * indentation}{convert_to_graphql_as_string(item)},") lines.append(offset_str + "]") else: lines.append(f"{offset_str}{key}: {convert_to_graphql_as_string(value)}") diff --git a/python_sdk/infrahub_sdk/node.py b/python_sdk/infrahub_sdk/node.py index 95fa627f33..3ff6c71333 100644 --- a/python_sdk/infrahub_sdk/node.py +++ b/python_sdk/infrahub_sdk/node.py @@ -772,7 +772,8 @@ def generate_query_data_node( Args: include (Optional[List[str]], optional): List of attributes or relationships to include. Defaults to None. exclude (Optional[List[str]], optional): List of attributes or relationships to exclude. Defaults to None. - inherited (bool, optional): Indicated of the attributes and the relationships inherited from generics should be included as well. Defaults to True. + inherited (bool, optional): Indicated of the attributes and the relationships inherited from generics should be included as well. + Defaults to True. 
Returns: Dict[str, Union[Any, Dict]]: Query in Dict format diff --git a/python_sdk/tests/conftest.py b/python_sdk/tests/conftest.py index 66217e5971..afd1519832 100644 --- a/python_sdk/tests/conftest.py +++ b/python_sdk/tests/conftest.py @@ -1,7 +1,6 @@ import asyncio import pytest - from infrahub_ctl import config diff --git a/python_sdk/tests/integration/conftest.py b/python_sdk/tests/integration/conftest.py index a384d2f218..067ecdb1ac 100644 --- a/python_sdk/tests/integration/conftest.py +++ b/python_sdk/tests/integration/conftest.py @@ -4,19 +4,19 @@ from typing import Any, Dict, Optional import httpx -import infrahub.config as config import pytest from fastapi.testclient import TestClient +from infrahub_sdk.schema import NodeSchema +from infrahub_sdk.types import HTTPMethod +from infrahub_sdk.utils import str_to_bool + +import infrahub.config as config from infrahub.core.initialization import first_time_initialization, initialization from infrahub.core.node import Node from infrahub.core.utils import delete_all_nodes from infrahub.database import InfrahubDatabase, get_db from infrahub.lock import initialize_lock -from infrahub_sdk.schema import NodeSchema -from infrahub_sdk.types import HTTPMethod -from infrahub_sdk.utils import str_to_bool - BUILD_NAME = os.environ.get("INFRAHUB_BUILD_NAME", "infrahub") TEST_IN_DOCKER = str_to_bool(os.environ.get("INFRAHUB_TEST_IN_DOCKER", "false")) diff --git a/python_sdk/tests/integration/test_infrahub_client.py b/python_sdk/tests/integration/test_infrahub_client.py index 9977c8f344..15589262d5 100644 --- a/python_sdk/tests/integration/test_infrahub_client.py +++ b/python_sdk/tests/integration/test_infrahub_client.py @@ -1,13 +1,13 @@ import pytest +from infrahub_sdk import Config, InfrahubClient +from infrahub_sdk.exceptions import BranchNotFound +from infrahub_sdk.node import InfrahubNode + from infrahub.core import registry from infrahub.core.initialization import create_branch from infrahub.core.node import Node from 
infrahub.database import InfrahubDatabase -from infrahub_sdk import Config, InfrahubClient -from infrahub_sdk.exceptions import BranchNotFound -from infrahub_sdk.node import InfrahubNode - from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/integration/test_infrahub_client_sync.py b/python_sdk/tests/integration/test_infrahub_client_sync.py index abcf4fd2e1..939c5f998a 100644 --- a/python_sdk/tests/integration/test_infrahub_client_sync.py +++ b/python_sdk/tests/integration/test_infrahub_client_sync.py @@ -1,12 +1,12 @@ import pytest -from infrahub.core.initialization import create_branch -from infrahub.core.node import Node -from infrahub.database import InfrahubDatabase - from infrahub_sdk import Config, InfrahubClientSync from infrahub_sdk.exceptions import BranchNotFound from infrahub_sdk.node import InfrahubNodeSync +from infrahub.core.initialization import create_branch +from infrahub.core.node import Node +from infrahub.database import InfrahubDatabase + from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/integration/test_node.py b/python_sdk/tests/integration/test_node.py index c85e7c0bdf..8aa7240520 100644 --- a/python_sdk/tests/integration/test_node.py +++ b/python_sdk/tests/integration/test_node.py @@ -1,12 +1,12 @@ import pytest -from infrahub.core.manager import NodeManager -from infrahub.core.node import Node -from infrahub.database import InfrahubDatabase - from infrahub_sdk import Config, InfrahubClient from infrahub_sdk.exceptions import NodeNotFound from infrahub_sdk.node import InfrahubNode +from infrahub.core.manager import NodeManager +from infrahub.core.node import Node +from infrahub.database import InfrahubDatabase + from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/integration/test_object_store.py b/python_sdk/tests/integration/test_object_store.py index 81905d5a4e..0470c9b667 100644 
--- a/python_sdk/tests/integration/test_object_store.py +++ b/python_sdk/tests/integration/test_object_store.py @@ -1,5 +1,4 @@ import pytest - from infrahub_sdk import Config, InfrahubClient from .conftest import InfrahubTestClient diff --git a/python_sdk/tests/integration/test_schema.py b/python_sdk/tests/integration/test_schema.py index e6712002a0..c42894e47d 100644 --- a/python_sdk/tests/integration/test_schema.py +++ b/python_sdk/tests/integration/test_schema.py @@ -1,9 +1,9 @@ import pytest -from infrahub.core.schema import core_models - from infrahub_sdk import Config, InfrahubClient from infrahub_sdk.schema import NodeSchema +from infrahub.core.schema import core_models + from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/unit/ctl/test_branch_app.py b/python_sdk/tests/unit/ctl/test_branch_app.py index 40160cf5e0..149846a257 100644 --- a/python_sdk/tests/unit/ctl/test_branch_app.py +++ b/python_sdk/tests/unit/ctl/test_branch_app.py @@ -1,8 +1,7 @@ +from infrahub_ctl.branch import app from pytest_httpx import HTTPXMock from typer.testing import CliRunner -from infrahub_ctl.branch import app - runner = CliRunner() # pylint: disable=unused-argument diff --git a/python_sdk/tests/unit/ctl/test_cli.py b/python_sdk/tests/unit/ctl/test_cli.py index 84f77bbc72..6edf5043bb 100644 --- a/python_sdk/tests/unit/ctl/test_cli.py +++ b/python_sdk/tests/unit/ctl/test_cli.py @@ -1,6 +1,5 @@ -from typer.testing import CliRunner - from infrahub_ctl.cli import app +from typer.testing import CliRunner runner = CliRunner() diff --git a/python_sdk/tests/unit/ctl/test_schema_app.py b/python_sdk/tests/unit/ctl/test_schema_app.py index ed5bf10292..47309c29b8 100644 --- a/python_sdk/tests/unit/ctl/test_schema_app.py +++ b/python_sdk/tests/unit/ctl/test_schema_app.py @@ -3,11 +3,10 @@ import pytest import yaml -from pytest_httpx import HTTPXMock -from typer.testing import CliRunner - from infrahub_ctl.schema import app from 
infrahub_ctl.utils import get_fixtures_dir +from pytest_httpx import HTTPXMock +from typer.testing import CliRunner runner = CliRunner() diff --git a/python_sdk/tests/unit/ctl/test_validate_app.py b/python_sdk/tests/unit/ctl/test_validate_app.py index fece5def00..0e5c894364 100644 --- a/python_sdk/tests/unit/ctl/test_validate_app.py +++ b/python_sdk/tests/unit/ctl/test_validate_app.py @@ -1,10 +1,9 @@ import os import pytest -from typer.testing import CliRunner - from infrahub_ctl.utils import get_fixtures_dir from infrahub_ctl.validate import app +from typer.testing import CliRunner runner = CliRunner() diff --git a/python_sdk/tests/unit/sdk/conftest.py b/python_sdk/tests/unit/sdk/conftest.py index a95981f471..9ed3fc776a 100644 --- a/python_sdk/tests/unit/sdk/conftest.py +++ b/python_sdk/tests/unit/sdk/conftest.py @@ -4,11 +4,10 @@ import pytest import ujson -from pytest_httpx import HTTPXMock - from infrahub_sdk import InfrahubClient, InfrahubClientSync from infrahub_sdk.schema import BranchSupportType, NodeSchema from infrahub_sdk.utils import get_fixtures_dir +from pytest_httpx import HTTPXMock # pylint: disable=redefined-outer-name,unused-argument diff --git a/python_sdk/tests/unit/sdk/test_branch.py b/python_sdk/tests/unit/sdk/test_branch.py index df08a5f99e..e06542cebe 100644 --- a/python_sdk/tests/unit/sdk/test_branch.py +++ b/python_sdk/tests/unit/sdk/test_branch.py @@ -1,7 +1,6 @@ import inspect import pytest - from infrahub_sdk.branch import ( BranchData, InfrahubBranchManager, diff --git a/python_sdk/tests/unit/sdk/test_client.py b/python_sdk/tests/unit/sdk/test_client.py index 5e7316f7eb..fad5884ba5 100644 --- a/python_sdk/tests/unit/sdk/test_client.py +++ b/python_sdk/tests/unit/sdk/test_client.py @@ -1,12 +1,11 @@ import inspect import pytest -from pytest_httpx import HTTPXMock - from infrahub_sdk import InfrahubClient, InfrahubClientSync from infrahub_sdk.data import RepositoryData from infrahub_sdk.exceptions import FilterNotFound, NodeNotFound 
from infrahub_sdk.node import InfrahubNode, InfrahubNodeSync +from pytest_httpx import HTTPXMock async_client_methods = [method for method in dir(InfrahubClient) if not method.startswith("_")] sync_client_methods = [method for method in dir(InfrahubClientSync) if not method.startswith("_")] diff --git a/python_sdk/tests/unit/sdk/test_config.py b/python_sdk/tests/unit/sdk/test_config.py index 469954e7e8..fbf266b475 100644 --- a/python_sdk/tests/unit/sdk/test_config.py +++ b/python_sdk/tests/unit/sdk/test_config.py @@ -1,7 +1,6 @@ import pytest -from pydantic.error_wrappers import ValidationError - from infrahub_sdk.config import Config +from pydantic.error_wrappers import ValidationError def test_combine_authentications(): diff --git a/python_sdk/tests/unit/sdk/test_graphql.py b/python_sdk/tests/unit/sdk/test_graphql.py index b9e88b055c..6ae52d8cd2 100644 --- a/python_sdk/tests/unit/sdk/test_graphql.py +++ b/python_sdk/tests/unit/sdk/test_graphql.py @@ -1,5 +1,4 @@ import pytest - from infrahub_sdk.graphql import Mutation, Query, render_input_block, render_query_block # pylint: disable=redefined-outer-name diff --git a/python_sdk/tests/unit/sdk/test_node.py b/python_sdk/tests/unit/sdk/test_node.py index 7f1e5e1976..1cb61b35db 100644 --- a/python_sdk/tests/unit/sdk/test_node.py +++ b/python_sdk/tests/unit/sdk/test_node.py @@ -3,8 +3,6 @@ from typing import TYPE_CHECKING import pytest -from pytest_httpx import HTTPXMock - from infrahub_sdk.exceptions import NodeNotFound from infrahub_sdk.node import ( SAFE_VALUE, @@ -14,6 +12,7 @@ RelatedNodeBase, RelationshipManagerBase, ) +from pytest_httpx import HTTPXMock if TYPE_CHECKING: from infrahub_sdk.client import InfrahubClient, InfrahubClientSync diff --git a/python_sdk/tests/unit/sdk/test_object_store.py b/python_sdk/tests/unit/sdk/test_object_store.py index 8da5d46039..72733ec816 100644 --- a/python_sdk/tests/unit/sdk/test_object_store.py +++ b/python_sdk/tests/unit/sdk/test_object_store.py @@ -1,9 +1,8 @@ import 
inspect import pytest -from pytest_httpx import HTTPXMock - from infrahub_sdk.object_store import ObjectStore, ObjectStoreSync +from pytest_httpx import HTTPXMock # pylint: disable=redefined-outer-name,unused-argument diff --git a/python_sdk/tests/unit/sdk/test_schema.py b/python_sdk/tests/unit/sdk/test_schema.py index 70c3ebda09..497140e058 100644 --- a/python_sdk/tests/unit/sdk/test_schema.py +++ b/python_sdk/tests/unit/sdk/test_schema.py @@ -1,7 +1,6 @@ import inspect import pytest - from infrahub_sdk import InfrahubClient, InfrahubClientSync, ValidationError from infrahub_sdk.schema import InfrahubSchema, InfrahubSchemaSync, NodeSchema diff --git a/python_sdk/tests/unit/sdk/test_store.py b/python_sdk/tests/unit/sdk/test_store.py index ea298b39ef..795bb8983a 100644 --- a/python_sdk/tests/unit/sdk/test_store.py +++ b/python_sdk/tests/unit/sdk/test_store.py @@ -1,5 +1,4 @@ import pytest - from infrahub_sdk import InfrahubNode, NodeStore client_types = ["standard", "sync"] diff --git a/python_sdk/tests/unit/sdk/test_timestamp.py b/python_sdk/tests/unit/sdk/test_timestamp.py index 12c7aa0627..cde13a566a 100644 --- a/python_sdk/tests/unit/sdk/test_timestamp.py +++ b/python_sdk/tests/unit/sdk/test_timestamp.py @@ -1,6 +1,5 @@ import pendulum import pytest - from infrahub_sdk.timestamp import Timestamp diff --git a/python_sdk/tests/unit/sdk/test_utils.py b/python_sdk/tests/unit/sdk/test_utils.py index 3ff235d6b2..bef0140a43 100644 --- a/python_sdk/tests/unit/sdk/test_utils.py +++ b/python_sdk/tests/unit/sdk/test_utils.py @@ -1,7 +1,6 @@ import uuid import pytest - from infrahub_sdk.node import InfrahubNode from infrahub_sdk.utils import ( base16decode, From 0be178df6bc119c8472bbc0f6cab96db652fd460 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 11:44:22 +0100 Subject: [PATCH 013/446] update lock file for sdk --- python_sdk/poetry.lock | 64 +----------------------------------------- 1 file changed, 1 insertion(+), 63 deletions(-) diff --git 
a/python_sdk/poetry.lock b/python_sdk/poetry.lock index 0ea819e138..4096fb35de 100644 --- a/python_sdk/poetry.lock +++ b/python_sdk/poetry.lock @@ -108,48 +108,6 @@ files = [ {file = "backcall-0.2.0.tar.gz", hash = "sha256:5cbdbf27be5e7cfadb448baf0aa95508f91f2bbc6c6437cd9cd06e2a4c215e1e"}, ] -[[package]] -name = "black" -version = "23.10.1" -description = "The uncompromising code formatter." -optional = false -python-versions = ">=3.8" -files = [ - {file = "black-23.10.1-cp310-cp310-macosx_10_16_arm64.whl", hash = "sha256:ec3f8e6234c4e46ff9e16d9ae96f4ef69fa328bb4ad08198c8cee45bb1f08c69"}, - {file = "black-23.10.1-cp310-cp310-macosx_10_16_x86_64.whl", hash = "sha256:1b917a2aa020ca600483a7b340c165970b26e9029067f019e3755b56e8dd5916"}, - {file = "black-23.10.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c74de4c77b849e6359c6f01987e94873c707098322b91490d24296f66d067dc"}, - {file = "black-23.10.1-cp310-cp310-win_amd64.whl", hash = "sha256:7b4d10b0f016616a0d93d24a448100adf1699712fb7a4efd0e2c32bbb219b173"}, - {file = "black-23.10.1-cp311-cp311-macosx_10_16_arm64.whl", hash = "sha256:b15b75fc53a2fbcac8a87d3e20f69874d161beef13954747e053bca7a1ce53a0"}, - {file = "black-23.10.1-cp311-cp311-macosx_10_16_x86_64.whl", hash = "sha256:e293e4c2f4a992b980032bbd62df07c1bcff82d6964d6c9496f2cd726e246ace"}, - {file = "black-23.10.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d56124b7a61d092cb52cce34182a5280e160e6aff3137172a68c2c2c4b76bcb"}, - {file = "black-23.10.1-cp311-cp311-win_amd64.whl", hash = "sha256:3f157a8945a7b2d424da3335f7ace89c14a3b0625e6593d21139c2d8214d55ce"}, - {file = "black-23.10.1-cp38-cp38-macosx_10_16_arm64.whl", hash = "sha256:cfcce6f0a384d0da692119f2d72d79ed07c7159879d0bb1bb32d2e443382bf3a"}, - {file = "black-23.10.1-cp38-cp38-macosx_10_16_x86_64.whl", hash = "sha256:33d40f5b06be80c1bbce17b173cda17994fbad096ce60eb22054da021bf933d1"}, - {file = 
"black-23.10.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:840015166dbdfbc47992871325799fd2dc0dcf9395e401ada6d88fe11498abad"}, - {file = "black-23.10.1-cp38-cp38-win_amd64.whl", hash = "sha256:037e9b4664cafda5f025a1728c50a9e9aedb99a759c89f760bd83730e76ba884"}, - {file = "black-23.10.1-cp39-cp39-macosx_10_16_arm64.whl", hash = "sha256:7cb5936e686e782fddb1c73f8aa6f459e1ad38a6a7b0e54b403f1f05a1507ee9"}, - {file = "black-23.10.1-cp39-cp39-macosx_10_16_x86_64.whl", hash = "sha256:7670242e90dc129c539e9ca17665e39a146a761e681805c54fbd86015c7c84f7"}, - {file = "black-23.10.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ed45ac9a613fb52dad3b61c8dea2ec9510bf3108d4db88422bacc7d1ba1243d"}, - {file = "black-23.10.1-cp39-cp39-win_amd64.whl", hash = "sha256:6d23d7822140e3fef190734216cefb262521789367fbdc0b3f22af6744058982"}, - {file = "black-23.10.1-py3-none-any.whl", hash = "sha256:d431e6739f727bb2e0495df64a6c7a5310758e87505f5f8cde9ff6c0f2d7e4fe"}, - {file = "black-23.10.1.tar.gz", hash = "sha256:1f8ce316753428ff68749c65a5f7844631aa18c8679dfd3ca9dc1a289979c258"}, -] - -[package.dependencies] -click = ">=8.0.0" -mypy-extensions = ">=0.4.3" -packaging = ">=22.0" -pathspec = ">=0.9.0" -platformdirs = ">=2" -tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} -typing-extensions = {version = ">=4.0.1", markers = "python_version < \"3.11\""} - -[package.extras] -colorama = ["colorama (>=0.4.3)"] -d = ["aiohttp (>=3.7.4)"] -jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"] -uvloop = ["uvloop (>=0.15.2)"] - [[package]] name = "buildkite-test-collector" version = "0.1.7" @@ -772,16 +730,6 @@ files = [ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, {file = 
"MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, @@ -1384,7 +1332,6 @@ files = [ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"}, - {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"}, {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"}, {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"}, {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"}, @@ -1392,15 +1339,8 @@ files = [ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"}, - {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = 
"sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"}, {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"}, {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, - {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, - {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, - {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, - {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, - {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, - {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"}, {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"}, @@ -1417,7 +1357,6 @@ files = [ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"}, - {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"}, {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"}, {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"}, {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"}, @@ -1425,7 +1364,6 @@ files = [ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"}, - {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"}, {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"}, {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"}, {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, @@ -1854,4 +1792,4 @@ dev = ["doc8", "flake8", "flake8-import-order", "rstcheck[sphinx]", "sphinx"] [metadata] lock-version = "2.0" python-versions = "^3.8" -content-hash = 
"a2361ee8d3d1ba3759884ddd38ad89bb5004ddd96fc5b05c06167a725bf69a60" +content-hash = "9317600cf7a573f42b19e9c4ff9df6754799eb69a43789c2ca013f8fd697dfe0" From d83bcdbd076f3dc5ec43497aa4a1acad3918edf6 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 13 Nov 2023 14:35:02 +0100 Subject: [PATCH 014/446] Changes -> Changelog in release note drafter We typically have a written section above these changes and replace this title with Changelog, might as well change the template so that it's correct from the start. --- .github/release-note.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/release-note.yml b/.github/release-note.yml index a8e0cbdcfd..9351f3896c 100644 --- a/.github/release-note.yml +++ b/.github/release-note.yml @@ -26,6 +26,6 @@ change-title-escapes: '\<*_&' # You can add # and @ to disable mentions, and ad # - 'patch' # default: patch template: | - ## Changes + ## Changelog $CHANGES From cf2a0b973d78c949d84dc302a634972b31adb607 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 15:14:11 +0100 Subject: [PATCH 015/446] use the right pyproject.toml for python_sdk --- python_sdk/examples/branch_create.py | 4 +--- python_sdk/examples/branch_create_sync.py | 4 +--- python_sdk/infrahub_ctl/branch.py | 2 +- python_sdk/infrahub_ctl/cli.py | 4 ++-- python_sdk/infrahub_ctl/client.py | 3 +-- python_sdk/infrahub_ctl/schema.py | 2 +- python_sdk/infrahub_ctl/validate.py | 2 +- python_sdk/tests/conftest.py | 1 + python_sdk/tests/integration/conftest.py | 10 +++++----- python_sdk/tests/integration/test_infrahub_client.py | 8 ++++---- .../tests/integration/test_infrahub_client_sync.py | 8 ++++---- python_sdk/tests/integration/test_node.py | 8 ++++---- python_sdk/tests/integration/test_object_store.py | 1 + python_sdk/tests/integration/test_schema.py | 4 ++-- python_sdk/tests/unit/ctl/test_branch_app.py | 3 ++- python_sdk/tests/unit/ctl/test_cli.py | 3 ++- python_sdk/tests/unit/ctl/test_schema_app.py | 5 +++-- 
python_sdk/tests/unit/ctl/test_validate_app.py | 3 ++- python_sdk/tests/unit/sdk/conftest.py | 3 ++- python_sdk/tests/unit/sdk/test_branch.py | 1 + python_sdk/tests/unit/sdk/test_client.py | 3 ++- python_sdk/tests/unit/sdk/test_config.py | 3 ++- python_sdk/tests/unit/sdk/test_graphql.py | 1 + python_sdk/tests/unit/sdk/test_node.py | 3 ++- python_sdk/tests/unit/sdk/test_object_store.py | 3 ++- python_sdk/tests/unit/sdk/test_schema.py | 1 + python_sdk/tests/unit/sdk/test_store.py | 1 + python_sdk/tests/unit/sdk/test_timestamp.py | 1 + python_sdk/tests/unit/sdk/test_utils.py | 1 + 29 files changed, 54 insertions(+), 42 deletions(-) diff --git a/python_sdk/examples/branch_create.py b/python_sdk/examples/branch_create.py index ac6300a585..e5e013919d 100644 --- a/python_sdk/examples/branch_create.py +++ b/python_sdk/examples/branch_create.py @@ -5,9 +5,7 @@ async def main(): client = await InfrahubClient.init(address="http://localhost:8000") - await client.branch.create( - branch_name="new-branch", description="description", data_only=True - ) + await client.branch.create(branch_name="new-branch", description="description", data_only=True) print("New branch created") diff --git a/python_sdk/examples/branch_create_sync.py b/python_sdk/examples/branch_create_sync.py index 332f6a5336..196daa0a28 100644 --- a/python_sdk/examples/branch_create_sync.py +++ b/python_sdk/examples/branch_create_sync.py @@ -3,9 +3,7 @@ def main(): client = InfrahubClientSync.init(address="http://localhost:8000") - client.branch.create( - branch_name="new-branch2", description="description", data_only=True - ) + client.branch.create(branch_name="new-branch2", description="description", data_only=True) print("New branch created") diff --git a/python_sdk/infrahub_ctl/branch.py b/python_sdk/infrahub_ctl/branch.py index 839d1c65b3..e1de2119c9 100644 --- a/python_sdk/infrahub_ctl/branch.py +++ b/python_sdk/infrahub_ctl/branch.py @@ -6,7 +6,6 @@ from typing import Dict, Generator, List, Optional, Union 
import typer -from infrahub_sdk import Error, GraphQLError from rich.console import Console from rich.console import group as rich_group from rich.panel import Panel @@ -19,6 +18,7 @@ print_graphql_errors, render_action_rich, ) +from infrahub_sdk import Error, GraphQLError app = typer.Typer() diff --git a/python_sdk/infrahub_ctl/cli.py b/python_sdk/infrahub_ctl/cli.py index 32f3f74fb5..e88aa5bf10 100644 --- a/python_sdk/infrahub_ctl/cli.py +++ b/python_sdk/infrahub_ctl/cli.py @@ -9,8 +9,6 @@ import jinja2 import typer -from infrahub_sdk.exceptions import GraphQLError -from infrahub_sdk.schema import InfrahubRepositoryConfig from pydantic import ValidationError from rich.console import Console from rich.logging import RichHandler @@ -31,6 +29,8 @@ parse_cli_vars, ) from infrahub_ctl.validate import app as validate_app +from infrahub_sdk.exceptions import GraphQLError +from infrahub_sdk.schema import InfrahubRepositoryConfig app = typer.Typer(pretty_exceptions_show_locals=False) diff --git a/python_sdk/infrahub_ctl/client.py b/python_sdk/infrahub_ctl/client.py index 0d6c0ed89f..bb43a1c00a 100644 --- a/python_sdk/infrahub_ctl/client.py +++ b/python_sdk/infrahub_ctl/client.py @@ -1,10 +1,9 @@ from typing import Any +import infrahub_ctl.config as config from infrahub_sdk import InfrahubClient, InfrahubClientSync from infrahub_sdk.config import Config -import infrahub_ctl.config as config - async def initialize_client(**kwargs: Any) -> InfrahubClient: client_config = {} diff --git a/python_sdk/infrahub_ctl/schema.py b/python_sdk/infrahub_ctl/schema.py index 9d7d6ac12e..f031f79c0b 100644 --- a/python_sdk/infrahub_ctl/schema.py +++ b/python_sdk/infrahub_ctl/schema.py @@ -5,13 +5,13 @@ import typer import yaml -from infrahub_sdk.utils import find_files from pydantic import BaseModel, ValidationError from rich.console import Console from rich.logging import RichHandler import infrahub_ctl.config as config from infrahub_ctl.client import initialize_client +from 
infrahub_sdk.utils import find_files app = typer.Typer() diff --git a/python_sdk/infrahub_ctl/validate.py b/python_sdk/infrahub_ctl/validate.py index b528d4483d..7f6094be5b 100644 --- a/python_sdk/infrahub_ctl/validate.py +++ b/python_sdk/infrahub_ctl/validate.py @@ -5,7 +5,6 @@ import typer import yaml -from infrahub_sdk.exceptions import GraphQLError from pydantic import ValidationError from rich.console import Console from ujson import JSONDecodeError @@ -14,6 +13,7 @@ from infrahub_ctl.client import initialize_client, initialize_client_sync from infrahub_ctl.exceptions import QueryNotFoundError from infrahub_ctl.utils import find_graphql_query, get_branch, parse_cli_vars +from infrahub_sdk.exceptions import GraphQLError app = typer.Typer() diff --git a/python_sdk/tests/conftest.py b/python_sdk/tests/conftest.py index afd1519832..66217e5971 100644 --- a/python_sdk/tests/conftest.py +++ b/python_sdk/tests/conftest.py @@ -1,6 +1,7 @@ import asyncio import pytest + from infrahub_ctl import config diff --git a/python_sdk/tests/integration/conftest.py b/python_sdk/tests/integration/conftest.py index 067ecdb1ac..a384d2f218 100644 --- a/python_sdk/tests/integration/conftest.py +++ b/python_sdk/tests/integration/conftest.py @@ -4,19 +4,19 @@ from typing import Any, Dict, Optional import httpx +import infrahub.config as config import pytest from fastapi.testclient import TestClient -from infrahub_sdk.schema import NodeSchema -from infrahub_sdk.types import HTTPMethod -from infrahub_sdk.utils import str_to_bool - -import infrahub.config as config from infrahub.core.initialization import first_time_initialization, initialization from infrahub.core.node import Node from infrahub.core.utils import delete_all_nodes from infrahub.database import InfrahubDatabase, get_db from infrahub.lock import initialize_lock +from infrahub_sdk.schema import NodeSchema +from infrahub_sdk.types import HTTPMethod +from infrahub_sdk.utils import str_to_bool + BUILD_NAME = 
os.environ.get("INFRAHUB_BUILD_NAME", "infrahub") TEST_IN_DOCKER = str_to_bool(os.environ.get("INFRAHUB_TEST_IN_DOCKER", "false")) diff --git a/python_sdk/tests/integration/test_infrahub_client.py b/python_sdk/tests/integration/test_infrahub_client.py index 15589262d5..9977c8f344 100644 --- a/python_sdk/tests/integration/test_infrahub_client.py +++ b/python_sdk/tests/integration/test_infrahub_client.py @@ -1,13 +1,13 @@ import pytest -from infrahub_sdk import Config, InfrahubClient -from infrahub_sdk.exceptions import BranchNotFound -from infrahub_sdk.node import InfrahubNode - from infrahub.core import registry from infrahub.core.initialization import create_branch from infrahub.core.node import Node from infrahub.database import InfrahubDatabase +from infrahub_sdk import Config, InfrahubClient +from infrahub_sdk.exceptions import BranchNotFound +from infrahub_sdk.node import InfrahubNode + from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/integration/test_infrahub_client_sync.py b/python_sdk/tests/integration/test_infrahub_client_sync.py index 939c5f998a..abcf4fd2e1 100644 --- a/python_sdk/tests/integration/test_infrahub_client_sync.py +++ b/python_sdk/tests/integration/test_infrahub_client_sync.py @@ -1,12 +1,12 @@ import pytest -from infrahub_sdk import Config, InfrahubClientSync -from infrahub_sdk.exceptions import BranchNotFound -from infrahub_sdk.node import InfrahubNodeSync - from infrahub.core.initialization import create_branch from infrahub.core.node import Node from infrahub.database import InfrahubDatabase +from infrahub_sdk import Config, InfrahubClientSync +from infrahub_sdk.exceptions import BranchNotFound +from infrahub_sdk.node import InfrahubNodeSync + from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/integration/test_node.py b/python_sdk/tests/integration/test_node.py index 8aa7240520..c85e7c0bdf 100644 --- 
a/python_sdk/tests/integration/test_node.py +++ b/python_sdk/tests/integration/test_node.py @@ -1,12 +1,12 @@ import pytest -from infrahub_sdk import Config, InfrahubClient -from infrahub_sdk.exceptions import NodeNotFound -from infrahub_sdk.node import InfrahubNode - from infrahub.core.manager import NodeManager from infrahub.core.node import Node from infrahub.database import InfrahubDatabase +from infrahub_sdk import Config, InfrahubClient +from infrahub_sdk.exceptions import NodeNotFound +from infrahub_sdk.node import InfrahubNode + from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/integration/test_object_store.py b/python_sdk/tests/integration/test_object_store.py index 0470c9b667..81905d5a4e 100644 --- a/python_sdk/tests/integration/test_object_store.py +++ b/python_sdk/tests/integration/test_object_store.py @@ -1,4 +1,5 @@ import pytest + from infrahub_sdk import Config, InfrahubClient from .conftest import InfrahubTestClient diff --git a/python_sdk/tests/integration/test_schema.py b/python_sdk/tests/integration/test_schema.py index c42894e47d..e6712002a0 100644 --- a/python_sdk/tests/integration/test_schema.py +++ b/python_sdk/tests/integration/test_schema.py @@ -1,9 +1,9 @@ import pytest +from infrahub.core.schema import core_models + from infrahub_sdk import Config, InfrahubClient from infrahub_sdk.schema import NodeSchema -from infrahub.core.schema import core_models - from .conftest import InfrahubTestClient # pylint: disable=unused-argument diff --git a/python_sdk/tests/unit/ctl/test_branch_app.py b/python_sdk/tests/unit/ctl/test_branch_app.py index 149846a257..40160cf5e0 100644 --- a/python_sdk/tests/unit/ctl/test_branch_app.py +++ b/python_sdk/tests/unit/ctl/test_branch_app.py @@ -1,7 +1,8 @@ -from infrahub_ctl.branch import app from pytest_httpx import HTTPXMock from typer.testing import CliRunner +from infrahub_ctl.branch import app + runner = CliRunner() # pylint: disable=unused-argument diff 
--git a/python_sdk/tests/unit/ctl/test_cli.py b/python_sdk/tests/unit/ctl/test_cli.py index 6edf5043bb..84f77bbc72 100644 --- a/python_sdk/tests/unit/ctl/test_cli.py +++ b/python_sdk/tests/unit/ctl/test_cli.py @@ -1,6 +1,7 @@ -from infrahub_ctl.cli import app from typer.testing import CliRunner +from infrahub_ctl.cli import app + runner = CliRunner() diff --git a/python_sdk/tests/unit/ctl/test_schema_app.py b/python_sdk/tests/unit/ctl/test_schema_app.py index 47309c29b8..ed5bf10292 100644 --- a/python_sdk/tests/unit/ctl/test_schema_app.py +++ b/python_sdk/tests/unit/ctl/test_schema_app.py @@ -3,11 +3,12 @@ import pytest import yaml -from infrahub_ctl.schema import app -from infrahub_ctl.utils import get_fixtures_dir from pytest_httpx import HTTPXMock from typer.testing import CliRunner +from infrahub_ctl.schema import app +from infrahub_ctl.utils import get_fixtures_dir + runner = CliRunner() diff --git a/python_sdk/tests/unit/ctl/test_validate_app.py b/python_sdk/tests/unit/ctl/test_validate_app.py index 0e5c894364..fece5def00 100644 --- a/python_sdk/tests/unit/ctl/test_validate_app.py +++ b/python_sdk/tests/unit/ctl/test_validate_app.py @@ -1,9 +1,10 @@ import os import pytest +from typer.testing import CliRunner + from infrahub_ctl.utils import get_fixtures_dir from infrahub_ctl.validate import app -from typer.testing import CliRunner runner = CliRunner() diff --git a/python_sdk/tests/unit/sdk/conftest.py b/python_sdk/tests/unit/sdk/conftest.py index 9ed3fc776a..a95981f471 100644 --- a/python_sdk/tests/unit/sdk/conftest.py +++ b/python_sdk/tests/unit/sdk/conftest.py @@ -4,10 +4,11 @@ import pytest import ujson +from pytest_httpx import HTTPXMock + from infrahub_sdk import InfrahubClient, InfrahubClientSync from infrahub_sdk.schema import BranchSupportType, NodeSchema from infrahub_sdk.utils import get_fixtures_dir -from pytest_httpx import HTTPXMock # pylint: disable=redefined-outer-name,unused-argument diff --git a/python_sdk/tests/unit/sdk/test_branch.py 
b/python_sdk/tests/unit/sdk/test_branch.py index e06542cebe..df08a5f99e 100644 --- a/python_sdk/tests/unit/sdk/test_branch.py +++ b/python_sdk/tests/unit/sdk/test_branch.py @@ -1,6 +1,7 @@ import inspect import pytest + from infrahub_sdk.branch import ( BranchData, InfrahubBranchManager, diff --git a/python_sdk/tests/unit/sdk/test_client.py b/python_sdk/tests/unit/sdk/test_client.py index fad5884ba5..5e7316f7eb 100644 --- a/python_sdk/tests/unit/sdk/test_client.py +++ b/python_sdk/tests/unit/sdk/test_client.py @@ -1,11 +1,12 @@ import inspect import pytest +from pytest_httpx import HTTPXMock + from infrahub_sdk import InfrahubClient, InfrahubClientSync from infrahub_sdk.data import RepositoryData from infrahub_sdk.exceptions import FilterNotFound, NodeNotFound from infrahub_sdk.node import InfrahubNode, InfrahubNodeSync -from pytest_httpx import HTTPXMock async_client_methods = [method for method in dir(InfrahubClient) if not method.startswith("_")] sync_client_methods = [method for method in dir(InfrahubClientSync) if not method.startswith("_")] diff --git a/python_sdk/tests/unit/sdk/test_config.py b/python_sdk/tests/unit/sdk/test_config.py index fbf266b475..469954e7e8 100644 --- a/python_sdk/tests/unit/sdk/test_config.py +++ b/python_sdk/tests/unit/sdk/test_config.py @@ -1,7 +1,8 @@ import pytest -from infrahub_sdk.config import Config from pydantic.error_wrappers import ValidationError +from infrahub_sdk.config import Config + def test_combine_authentications(): with pytest.raises(ValidationError) as exc: diff --git a/python_sdk/tests/unit/sdk/test_graphql.py b/python_sdk/tests/unit/sdk/test_graphql.py index 6ae52d8cd2..b9e88b055c 100644 --- a/python_sdk/tests/unit/sdk/test_graphql.py +++ b/python_sdk/tests/unit/sdk/test_graphql.py @@ -1,4 +1,5 @@ import pytest + from infrahub_sdk.graphql import Mutation, Query, render_input_block, render_query_block # pylint: disable=redefined-outer-name diff --git a/python_sdk/tests/unit/sdk/test_node.py 
b/python_sdk/tests/unit/sdk/test_node.py index 1cb61b35db..7f1e5e1976 100644 --- a/python_sdk/tests/unit/sdk/test_node.py +++ b/python_sdk/tests/unit/sdk/test_node.py @@ -3,6 +3,8 @@ from typing import TYPE_CHECKING import pytest +from pytest_httpx import HTTPXMock + from infrahub_sdk.exceptions import NodeNotFound from infrahub_sdk.node import ( SAFE_VALUE, @@ -12,7 +14,6 @@ RelatedNodeBase, RelationshipManagerBase, ) -from pytest_httpx import HTTPXMock if TYPE_CHECKING: from infrahub_sdk.client import InfrahubClient, InfrahubClientSync diff --git a/python_sdk/tests/unit/sdk/test_object_store.py b/python_sdk/tests/unit/sdk/test_object_store.py index 72733ec816..8da5d46039 100644 --- a/python_sdk/tests/unit/sdk/test_object_store.py +++ b/python_sdk/tests/unit/sdk/test_object_store.py @@ -1,9 +1,10 @@ import inspect import pytest -from infrahub_sdk.object_store import ObjectStore, ObjectStoreSync from pytest_httpx import HTTPXMock +from infrahub_sdk.object_store import ObjectStore, ObjectStoreSync + # pylint: disable=redefined-outer-name,unused-argument async_methods = [method for method in dir(ObjectStore) if not method.startswith("_")] diff --git a/python_sdk/tests/unit/sdk/test_schema.py b/python_sdk/tests/unit/sdk/test_schema.py index 497140e058..70c3ebda09 100644 --- a/python_sdk/tests/unit/sdk/test_schema.py +++ b/python_sdk/tests/unit/sdk/test_schema.py @@ -1,6 +1,7 @@ import inspect import pytest + from infrahub_sdk import InfrahubClient, InfrahubClientSync, ValidationError from infrahub_sdk.schema import InfrahubSchema, InfrahubSchemaSync, NodeSchema diff --git a/python_sdk/tests/unit/sdk/test_store.py b/python_sdk/tests/unit/sdk/test_store.py index 795bb8983a..ea298b39ef 100644 --- a/python_sdk/tests/unit/sdk/test_store.py +++ b/python_sdk/tests/unit/sdk/test_store.py @@ -1,4 +1,5 @@ import pytest + from infrahub_sdk import InfrahubNode, NodeStore client_types = ["standard", "sync"] diff --git a/python_sdk/tests/unit/sdk/test_timestamp.py 
b/python_sdk/tests/unit/sdk/test_timestamp.py index cde13a566a..12c7aa0627 100644 --- a/python_sdk/tests/unit/sdk/test_timestamp.py +++ b/python_sdk/tests/unit/sdk/test_timestamp.py @@ -1,5 +1,6 @@ import pendulum import pytest + from infrahub_sdk.timestamp import Timestamp diff --git a/python_sdk/tests/unit/sdk/test_utils.py b/python_sdk/tests/unit/sdk/test_utils.py index bef0140a43..3ff235d6b2 100644 --- a/python_sdk/tests/unit/sdk/test_utils.py +++ b/python_sdk/tests/unit/sdk/test_utils.py @@ -1,6 +1,7 @@ import uuid import pytest + from infrahub_sdk.node import InfrahubNode from infrahub_sdk.utils import ( base16decode, From 595d6caa84d78f6e6b5f36643ad7e7fd6cf4880d Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 15:14:26 +0100 Subject: [PATCH 016/446] update sdk tasks --- tasks/sdk.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/tasks/sdk.py b/tasks/sdk.py index 3c90a0eadb..263e8d7bca 100644 --- a/tasks/sdk.py +++ b/tasks/sdk.py @@ -26,7 +26,7 @@ def format_ruff(context: Context): """Run ruff to format all Python files.""" print(f" - [{NAMESPACE}] Format code with ruff") - exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {MAIN_DIRECTORY}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -59,7 +59,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = "ruff check . --fix --config {REPO_BASE}/pyproject.toml" + exec_cmd = "ruff check . 
--fix --config pyproject.toml" exec_directory = MAIN_DIRECTORY_PATH if docker: From 40846fbed9295e8293d3b172564dab342b0b71c3 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 15:46:52 +0100 Subject: [PATCH 017/446] change fix and diff for ruff --- .github/workflows/ci.yml | 4 +++- tasks/backend.py | 5 +++-- tasks/ctl.py | 5 +++-- tasks/main.py | 3 ++- tasks/sdk.py | 9 ++++++--- tasks/sync.py | 3 ++- 6 files changed, 19 insertions(+), 10 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 8b1f027ed3..2feaf056ad 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -107,7 +107,9 @@ jobs: - name: "Setup environment" run: "pip install ruff==0.1.5" - name: "Linting: ruff" - run: "ruff check . --fix --config pyproject.toml" + run: "ruff check --diff . --exclude=python_sdk --config pyproject.toml" + - name: "Linting: ruff for python_SDK" + run: "ruff check --diff python_sdk --config python_sdk/pyproject.toml" # ------------------------------------------ Build Docker Image ------------------------------------------ # backend-build-docker: diff --git a/tasks/backend.py b/tasks/backend.py index 4316304a61..a59345f7e6 100644 --- a/tasks/backend.py +++ b/tasks/backend.py @@ -43,7 +43,8 @@ def format_ruff(context: Context): """Run ruff to format all Python files.""" print(f" - [{NAMESPACE}] Format code with ruff") - exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml && " + exec_cmd += f"ruff check --fix {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -76,7 +77,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix --config {REPO_BASE}/pyproject.toml" + exec_cmd = f"ruff 
check --diff {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) diff --git a/tasks/ctl.py b/tasks/ctl.py index f017d5a7bb..22fe130717 100644 --- a/tasks/ctl.py +++ b/tasks/ctl.py @@ -42,7 +42,8 @@ def format_ruff(context: Context): """Run ruff to format all Python files.""" print(f" - [{NAMESPACE}] Format code with ruff") - exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" + exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml && " + exec_cmd += f"ruff check --fix {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -75,7 +76,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix --config {REPO_BASE}/pyproject.toml" + exec_cmd = f"ruff check --diff {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" if docker: compose_files_cmd = build_test_compose_files_cmd(database=False) diff --git a/tasks/main.py b/tasks/main.py index 48b55ed240..2c51b11004 100644 --- a/tasks/main.py +++ b/tasks/main.py @@ -14,8 +14,9 @@ def format_ruff(context: Context): """Run ruff to format all Python files.""" print(f" - [{NAMESPACE}] Format code with ruff") + exec_cmd = f"ruff format {MAIN_DIRECTORY} models/ --config {REPO_BASE}/pyproject.toml && " + exec_cmd += f"ruff check --fix {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): - exec_cmd = f"ruff format {MAIN_DIRECTORY} models/ --config {REPO_BASE}/pyproject.toml" context.run(exec_cmd) diff --git a/tasks/sdk.py b/tasks/sdk.py index 263e8d7bca..043f13ea29 100644 --- a/tasks/sdk.py +++ b/tasks/sdk.py @@ -11,7 +11,7 @@ execute_command, get_env_vars, ) -from .utils import ESCAPED_REPO_PATH, REPO_BASE +from .utils import ESCAPED_REPO_PATH 
MAIN_DIRECTORY = "python_sdk" NAMESPACE = "SDK" @@ -26,7 +26,8 @@ def format_ruff(context: Context): """Run ruff to format all Python files.""" print(f" - [{NAMESPACE}] Format code with ruff") - exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {MAIN_DIRECTORY}/pyproject.toml" + exec_cmd = f"ruff format {MAIN_DIRECTORY}/ --config {MAIN_DIRECTORY}/pyproject.toml && " + exec_cmd += f"ruff check --fix {MAIN_DIRECTORY}/ --config {MAIN_DIRECTORY}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -59,10 +60,12 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = "ruff check . --fix --config pyproject.toml" exec_directory = MAIN_DIRECTORY_PATH + if not docker: + exec_cmd = f"ruff check --diff {exec_directory} --config {exec_directory}/pyproject.toml" if docker: + exec_cmd = "ruff check --diff . --config pyproject.toml" compose_files_cmd = build_test_compose_files_cmd(database=False) exec_cmd = ( f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME}" diff --git a/tasks/sync.py b/tasks/sync.py index 957253a59f..ba5a074d04 100644 --- a/tasks/sync.py +++ b/tasks/sync.py @@ -23,6 +23,7 @@ def format_ruff(context: Context): print(f" - [{NAMESPACE}] Format code with ruff") exec_cmd = f"ruff format {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" + exec_cmd += f"ruff check --fix {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) @@ -55,7 +56,7 @@ def ruff(context: Context, docker: bool = False): """Run ruff to check that Python files adherence to black standards.""" print(f" - [{NAMESPACE}] Check code with ruff") - exec_cmd = f"ruff check {MAIN_DIRECTORY} --fix --config {REPO_BASE}/pyproject.toml" + exec_cmd = f"ruff check --diff {MAIN_DIRECTORY} --config {REPO_BASE}/pyproject.toml" if docker: compose_files_cmd = 
build_test_compose_files_cmd(database=False) From 5a5a6c0286d658eebbc23b503e7154c21fd61422 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 13 Nov 2023 16:11:19 +0100 Subject: [PATCH 018/446] thanks Patrick to double checking the behaviour --- .github/workflows/ci.yml | 4 +--- python_sdk/poetry.lock | 40 +++++++++++++++++++-------------------- python_sdk/pyproject.toml | 2 +- 3 files changed, 22 insertions(+), 24 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 2feaf056ad..d17b327747 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -107,9 +107,7 @@ jobs: - name: "Setup environment" run: "pip install ruff==0.1.5" - name: "Linting: ruff" - run: "ruff check --diff . --exclude=python_sdk --config pyproject.toml" - - name: "Linting: ruff for python_SDK" - run: "ruff check --diff python_sdk --config python_sdk/pyproject.toml" + run: "ruff check --diff ." # ------------------------------------------ Build Docker Image ------------------------------------------ # backend-build-docker: diff --git a/python_sdk/poetry.lock b/python_sdk/poetry.lock index 4096fb35de..522a549fd0 100644 --- a/python_sdk/poetry.lock +++ b/python_sdk/poetry.lock @@ -1411,28 +1411,28 @@ jupyter = ["ipywidgets (>=7.5.1,<9)"] [[package]] name = "ruff" -version = "0.1.0" -description = "An extremely fast Python linter, written in Rust." +version = "0.1.5" +description = "An extremely fast Python linter and code formatter, written in Rust." 
optional = false python-versions = ">=3.7" files = [ - {file = "ruff-0.1.0-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:87114e254dee35e069e1b922d85d4b21a5b61aec759849f393e1dbb308a00439"}, - {file = "ruff-0.1.0-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:764f36d2982cc4a703e69fb73a280b7c539fd74b50c9ee531a4e3fe88152f521"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65f4b7fb539e5cf0f71e9bd74f8ddab74cabdd673c6fb7f17a4dcfd29f126255"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:299fff467a0f163baa282266b310589b21400de0a42d8f68553422fa6bf7ee01"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d412678bf205787263bb702c984012a4f97e460944c072fd7cfa2bd084857c4"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:a5391b49b1669b540924640587d8d24128e45be17d1a916b1801d6645e831581"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ee8cd57f454cdd77bbcf1e11ff4e0046fb6547cac1922cc6e3583ce4b9c326d1"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fa7aeed7bc23861a2b38319b636737bf11cfa55d2109620b49cf995663d3e888"}, - {file = "ruff-0.1.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b04cd4298b43b16824d9a37800e4c145ba75c29c43ce0d74cad1d66d7ae0a4c5"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:7186ccf54707801d91e6314a016d1c7895e21d2e4cd614500d55870ed983aa9f"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:d88adfd93849bc62449518228581d132e2023e30ebd2da097f73059900d8dce3"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_i686.whl", hash = "sha256:ad2ccdb3bad5a61013c76a9c1240fdfadf2c7103a2aeebd7bcbbed61f363138f"}, - {file = "ruff-0.1.0-py3-none-musllinux_1_2_x86_64.whl", hash = 
"sha256:b77f6cfa72c6eb19b5cac967cc49762ae14d036db033f7d97a72912770fd8e1c"}, - {file = "ruff-0.1.0-py3-none-win32.whl", hash = "sha256:480bd704e8af1afe3fd444cc52e3c900b936e6ca0baf4fb0281124330b6ceba2"}, - {file = "ruff-0.1.0-py3-none-win_amd64.whl", hash = "sha256:a76ba81860f7ee1f2d5651983f87beb835def94425022dc5f0803108f1b8bfa2"}, - {file = "ruff-0.1.0-py3-none-win_arm64.whl", hash = "sha256:45abdbdab22509a2c6052ecf7050b3f5c7d6b7898dc07e82869401b531d46da4"}, - {file = "ruff-0.1.0.tar.gz", hash = "sha256:ad6b13824714b19c5f8225871cf532afb994470eecb74631cd3500fe817e6b3f"}, + {file = "ruff-0.1.5-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:32d47fc69261c21a4c48916f16ca272bf2f273eb635d91c65d5cd548bf1f3d96"}, + {file = "ruff-0.1.5-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:171276c1df6c07fa0597fb946139ced1c2978f4f0b8254f201281729981f3c17"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17ef33cd0bb7316ca65649fc748acc1406dfa4da96a3d0cde6d52f2e866c7b39"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b2c205827b3f8c13b4a432e9585750b93fd907986fe1aec62b2a02cf4401eee6"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bb408e3a2ad8f6881d0f2e7ad70cddb3ed9f200eb3517a91a245bbe27101d379"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:f20dc5e5905ddb407060ca27267c7174f532375c08076d1a953cf7bb016f5a24"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aafb9d2b671ed934998e881e2c0f5845a4295e84e719359c71c39a5363cccc91"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a4894dddb476597a0ba4473d72a23151b8b3b0b5f958f2cf4d3f1c572cdb7af7"}, + {file = "ruff-0.1.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:a00a7ec893f665ed60008c70fe9eeb58d210e6b4d83ec6654a9904871f982a2a"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:a8c11206b47f283cbda399a654fd0178d7a389e631f19f51da15cbe631480c5b"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:fa29e67b3284b9a79b1a85ee66e293a94ac6b7bb068b307a8a373c3d343aa8ec"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:9b97fd6da44d6cceb188147b68db69a5741fbc736465b5cea3928fdac0bc1aeb"}, + {file = "ruff-0.1.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:721f4b9d3b4161df8dc9f09aa8562e39d14e55a4dbaa451a8e55bdc9590e20f4"}, + {file = "ruff-0.1.5-py3-none-win32.whl", hash = "sha256:f80c73bba6bc69e4fdc73b3991db0b546ce641bdcd5b07210b8ad6f64c79f1ab"}, + {file = "ruff-0.1.5-py3-none-win_amd64.whl", hash = "sha256:c21fe20ee7d76206d290a76271c1af7a5096bc4c73ab9383ed2ad35f852a0087"}, + {file = "ruff-0.1.5-py3-none-win_arm64.whl", hash = "sha256:82bfcb9927e88c1ed50f49ac6c9728dab3ea451212693fe40d08d314663e412f"}, + {file = "ruff-0.1.5.tar.gz", hash = "sha256:5cbec0ef2ae1748fb194f420fb03fb2c25c3258c86129af7172ff8f198f125ab"}, ] [[package]] @@ -1792,4 +1792,4 @@ dev = ["doc8", "flake8", "flake8-import-order", "rstcheck[sphinx]", "sphinx"] [metadata] lock-version = "2.0" python-versions = "^3.8" -content-hash = "9317600cf7a573f42b19e9c4ff9df6754799eb69a43789c2ca013f8fd697dfe0" +content-hash = "c99d3b78ea0bf7c735448b0aa273131305358985357d570b5ae9b05567d3588e" diff --git a/python_sdk/pyproject.toml b/python_sdk/pyproject.toml index a2bf834fd6..ca01693636 100644 --- a/python_sdk/pyproject.toml +++ b/python_sdk/pyproject.toml @@ -53,7 +53,7 @@ types-ujson = "*" types-pyyaml = "*" typer-cli = "*" pytest-cov = "^4.0.0" -ruff = "0.1.0" +ruff = "^0.1.5" pytest-xdist = "^3.3.1" buildkite-test-collector = "^0.1.7" types-python-slugify = "^8.0.0.3" From 4913241dcca4f20222ef62e08f5b40b1ca5c10ca Mon Sep 17 00:00:00 2001 From: pa-lem Date: Mon, 13 Nov 2023 17:28:17 +0100 Subject: 
[PATCH 019/446] fix icon for frontend and add route for backend --- backend/infrahub/server.py | 2 +- frontend/index.html | 6 +++--- frontend/public/{ => favicons}/favicon.ico | Bin frontend/public/{ => favicons}/logo192.png | Bin frontend/public/{ => favicons}/logo512.png | Bin frontend/public/manifest.json | 10 +++++----- 6 files changed, 9 insertions(+), 9 deletions(-) rename frontend/public/{ => favicons}/favicon.ico (100%) rename frontend/public/{ => favicons}/logo192.png (100%) rename frontend/public/{ => favicons}/logo512.png (100%) diff --git a/backend/infrahub/server.py b/backend/infrahub/server.py index d2d5b2ab32..2d1439847a 100644 --- a/backend/infrahub/server.py +++ b/backend/infrahub/server.py @@ -161,7 +161,7 @@ async def api_exception_handler_base_infrahub_error(_: Request, exc: Error) -> J if os.path.exists(FRONTEND_ASSET_DIRECTORY) and os.path.isdir(FRONTEND_ASSET_DIRECTORY): app.mount("/assets", StaticFiles(directory=FRONTEND_ASSET_DIRECTORY), "assets") - + app.mount("/favicons", StaticFiles(directory=FRONTEND_ASSET_DIRECTORY), "favicons") @app.get("/{rest_of_path:path}", include_in_schema=False) async def react_app(req: Request, rest_of_path: str): # pylint: disable=unused-argument diff --git a/frontend/index.html b/frontend/index.html index a75962dccc..30e07d4cf3 100644 --- a/frontend/index.html +++ b/frontend/index.html @@ -2,16 +2,16 @@ - + - + - + + diff --git a/docs/getting-started/index.yml b/docs/tutorials/getting-started/index.yml similarity index 77% rename from docs/getting-started/index.yml rename to docs/tutorials/getting-started/index.yml index 2c2e7ce054..4bf0ac2d50 100644 --- a/docs/getting-started/index.yml +++ b/docs/tutorials/getting-started/index.yml @@ -1,4 +1,3 @@ --- label: Getting Started icon: "rocket" -order: 1000 diff --git a/docs/tutorial/introduction-to-infrahub.md b/docs/tutorials/getting-started/introduction-to-infrahub.md similarity index 95% rename from docs/tutorial/introduction-to-infrahub.md rename to 
docs/tutorials/getting-started/introduction-to-infrahub.md index baaae4b9fb..3f148ec482 100644 --- a/docs/tutorial/introduction-to-infrahub.md +++ b/docs/tutorials/getting-started/introduction-to-infrahub.md @@ -9,7 +9,7 @@ Before starting this tutorial, let's take a moment to explore how Infrahub is or ## Infrahub Components -![](../media/high_level_architecture.excalidraw.svg) +![](../../media/high_level_architecture.excalidraw.svg) During this tutorial we'll mainly use the Frontend, the `infrahubctl` CLI and GraphQL via the API Server. diff --git a/docs/tutorial/jinja2-integration.md b/docs/tutorials/getting-started/jinja2-integration.md similarity index 91% rename from docs/tutorial/jinja2-integration.md rename to docs/tutorials/getting-started/jinja2-integration.md index cf4c3ba94e..e7173aa61f 100644 --- a/docs/tutorial/jinja2-integration.md +++ b/docs/tutorials/getting-started/jinja2-integration.md @@ -33,7 +33,7 @@ Next, we'll create a new branch, and make modifications both in the data and in From the frontend, create a new branch named `update-ethernet1` and be sure to uncheck the toggle `is Data Only` in the UI. -![Create a new branch (not with Data Only)](../media/tutorial/tutorial-6-git-integration.cy.ts/tutorial_6_branch_creation.png) +![Create a new branch (not with Data Only)](../../media/tutorial/tutorial-6-git-integration.cy.ts/tutorial_6_branch_creation.png) #### 2. Update the interface Ethernet 1 for atl1-edge1 @@ -46,7 +46,7 @@ Now we'll make a change in the branch `update-ethernet1` that will be reflected 5. Update its description to `New description in the branch` 6. Save your change -![Update the description for the interface Ethernet1](../media/tutorial/tutorial-6-git-integration.cy.ts/tutorial_6_interface_update.png) +![Update the description for the interface Ethernet1](../../media/tutorial/tutorial-6-git-integration.cy.ts/tutorial_6_interface_update.png) #### 3. 
Update the Jinja2 template in Github @@ -60,7 +60,7 @@ In Github: - Delete the lines 77 and 78 (i.e. the last two lines of 'ip prefix-list BOGON-Prefixes') - Commit your changes in the branch `update-ethernet1` directly from github -![Update the template in Github](../media/tutorial_rfile_update_jinja.gif) +![Update the template in Github](../../media/tutorial_rfile_update_jinja.gif) !!!success Validate that everything is correct diff --git a/docs/tutorial/lineage-information.md b/docs/tutorials/getting-started/lineage-information.md similarity index 92% rename from docs/tutorial/lineage-information.md rename to docs/tutorials/getting-started/lineage-information.md index d730d48f99..cd30a4cdb5 100644 --- a/docs/tutorial/lineage-information.md +++ b/docs/tutorials/getting-started/lineage-information.md @@ -25,7 +25,7 @@ If you navigate to the detailed page of any device you'll be able to see that: 2. The **role** has been defined by the `pop-builder`, is owner by the `Engineering Team` and `is_protected` as well 3. 
The **description** is neither protected nor does it has a source or a owner defined -![](../media/tutorial/tutorial-4-data.cy.ts/tutorial_4_metadata.png) +![](../../media/tutorial/tutorial-4-data.cy.ts/tutorial_4_metadata.png) ## Protected Field @@ -35,4 +35,4 @@ When a field is marked as protected, all users that aren't listed as the owner w It's possible to update the metadata by selecting the pencil on the top right corner of each metadata panel -![](../media/tutorial/tutorial-4-data.cy.ts/tutorial_4_metadata_edit.png) +![](../../media/tutorial/tutorial-4-data.cy.ts/tutorial_4_metadata_edit.png) diff --git a/docs/tutorial/readme.md b/docs/tutorials/getting-started/readme.md similarity index 58% rename from docs/tutorial/readme.md rename to docs/tutorials/getting-started/readme.md index 3699e8c85c..6b7ffe71eb 100644 --- a/docs/tutorial/readme.md +++ b/docs/tutorials/getting-started/readme.md @@ -1,4 +1,4 @@ -# Tutorial Intro +# Getting started with Infrahub This tutorial will get you started with Infrahub and it will help you get familiar with some of the main components and concepts behind Infrahub. To do that we'll use a sample dataset that represents a small network with 6 devices. 
This tutorial will teach you:
- How to manage branches and query any branch
@@ -18,20 +18,51 @@ The tutorial is meant to be executed in order, as we'll be making some changes a
 
 ## Prepare the Demo Environment
 
-### Build the demo environment locally
+### Prerequisites
+In order to run the demo environment, the following applications must be installed on your system:
+- [pyinvoke](https://www.pyinvoke.org/)
+- Docker & Docker Compose
+
+> On a laptop, both Docker & Docker Compose can be installed by installing [Docker Desktop](https://www.docker.com/products/docker-desktop/)
+
+### First use
+
+Before the first use you need to build the images for Infrahub with the command:
 ```
 invoke demo.build
+```
+Then initialize the database and start the application:
+```
 invoke demo.start
 ```
 
-### User Accounts available for the tutorial
+### Load some data
+
+Once you have an environment up and running, you can load your own schema or explore the one provided with the project using the following commands.
+```
+invoke demo.load-infra-schema
+invoke demo.load-infra-data
+```
+
+### Control the local environment
+
+- `invoke demo.start` : Start all the containers in detached mode.
+- `invoke demo.stop` : Stop all the containers.
+- `invoke demo.destroy` : Destroy all containers and volumes.
+
+
+!!!
+`invoke demo.debug` can be used as an alternative to `invoke demo.start`; the main difference is that it stays *attached* to the containers, and all the logs are displayed in real time in the CLI.
+!!!
+
+## User Accounts available for the tutorial
 
 Multiple user accounts with different levels of permissions are available. To follow the tutorial you should use the `admin` account, but you can try the other accounts too to see how the interface behaves with different permission levels.
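The `invoke demo.*` commands added in the readme above follow pyinvoke's namespaced-task pattern, where a dotted name such as `demo.start` is routed to a registered function. As a rough, self-contained illustration of that dispatch idea (hypothetical names; this is not Infrahub's actual `tasks/` code, and the real tasks shell out to Docker Compose):

```python
# Minimal sketch of an invoke-like dispatcher: dotted task names map to
# functions via a registry. Purely illustrative; pyinvoke itself provides
# Collections/namespaces for this.
from typing import Callable, Dict

TASKS: Dict[str, Callable[[], str]] = {}

def task(name: str) -> Callable[[Callable[[], str]], Callable[[], str]]:
    """Register a function under a dotted task name such as 'demo.start'."""
    def wrapper(func: Callable[[], str]) -> Callable[[], str]:
        TASKS[name] = func
        return func
    return wrapper

@task("demo.start")
def start() -> str:
    # A real task would run `docker compose up -d` here.
    return "containers started (detached)"

@task("demo.stop")
def stop() -> str:
    # A real task would run `docker compose stop` here.
    return "containers stopped"

def run(name: str) -> str:
    """Dispatch a dotted task name to its registered function."""
    if name not in TASKS:
        raise KeyError(f"unknown task: {name}")
    return TASKS[name]()
```

The difference between `demo.start` and `demo.debug` in the readme is then just which flags the task passes to Docker Compose (detached vs. attached).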
| name          | username        | password      | role       |
-|---------------|-----------------|---------------|------------|
+| ------------- | --------------- | ------------- | ---------- |
 | Administrator | `admin`         | `infrahub`    | admin      |
 | Chloe O'Brian | `Chloe O'Brian` | `Password123` | read-write |
 | David Palmer  | `David Palmer`  | `Password123` | read-only  |
diff --git a/docs/tutorial/schema.md b/docs/tutorials/getting-started/schema.md
similarity index 98%
rename from docs/tutorial/schema.md
rename to docs/tutorials/getting-started/schema.md
index 86113591de..8e034ddb69 100644
--- a/docs/tutorial/schema.md
+++ b/docs/tutorials/getting-started/schema.md
@@ -16,9 +16,9 @@ You can explore the current schema by visiting the schema page at the bottom of
 
 [!ref Explore the current schema](http://localhost:8000/api/schema)
 
-![](../media/tutorial/tutorial-3-schema.cy.ts/tutorial_3_schema.png)
+![](../../media/tutorial/tutorial-3-schema.cy.ts/tutorial_3_schema.png)
 
-[!ref Check the schema documentation for more information](../schema/readme.md)
+[!ref Check the schema documentation for more information](../../reference/schema/readme.md)
 
 ## Extend the schema with some network related models
 
@@ -27,7 +27,7 @@ In order to model a simple network, we need to extend the current models to capt
 
 A schema extension with these types of models and more is available in the `models/` directory
 
 ==- Infrastructure Base Schema
-:::code source="../../models/infrastructure_base.yml" :::
+:::code source="../../../models/infrastructure_base.yml" :::
 ==-
 
 Use the following command to load these new models into Infrahub
@@ -58,7 +58,7 @@ Schema loaded successfully!
 
 In order to have more meaningful data to explore, we'll use a sample topology of 6 devices as presented below, which leverages all the new models we just added to the schema.
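The three roles in the account table above (admin, read-write, read-only) grant increasingly restricted capabilities. A toy sketch of such a role check follows; the permission names are hypothetical and this is not Infrahub's actual authorization model:

```python
# Illustrative role/permission mapping for the three tutorial accounts.
# "manage_accounts" is an invented action name used only for this sketch.
ROLE_PERMISSIONS = {
    "admin": {"read", "write", "manage_accounts"},
    "read-write": {"read", "write"},
    "read-only": {"read"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role grants the requested action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# Under this sketch, Chloe O'Brian (read-write) can edit data but not
# manage accounts, and David Palmer (read-only) can only read.
```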
-![](../media/demo_edge.excalidraw.svg) +![](../../media/demo_edge.excalidraw.svg) Use the following command to load these new models into Infrahub diff --git a/docs/tutorials/index.yml b/docs/tutorials/index.yml new file mode 100644 index 0000000000..9569c843db --- /dev/null +++ b/docs/tutorials/index.yml @@ -0,0 +1,4 @@ +--- +label: Tutorials +icon: "mortar-board" +order: 1000 From d978b71e72bcb4cbcbf1136862083a7eb3f42722 Mon Sep 17 00:00:00 2001 From: Bilal Date: Wed, 15 Nov 2023 22:59:05 +0100 Subject: [PATCH 044/446] fix ci --- frontend/tests/e2e/branches.cy.ts | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/frontend/tests/e2e/branches.cy.ts b/frontend/tests/e2e/branches.cy.ts index bf4fb349ed..296174cf0b 100644 --- a/frontend/tests/e2e/branches.cy.ts +++ b/frontend/tests/e2e/branches.cy.ts @@ -53,7 +53,7 @@ describe("Branches creation and deletion", () => { cy.get("[data-cy='branch-select-menu']").contains("test123"); cy.url().should("include", "/branches").and("include", "branch=test123"); - cy.get("[data-cy='branch-list-display-button']").click({ force: true }); + cy.get("[data-cy='branch-list-display-button']").click(); cy.get("[data-cy='branch-list-dropdown']").contains("test123"); cy.get("[data-cy='branch-list-dropdown']").contains("test456").should("not.exist"); }); From 46696d6f1222d2a54b44450cbe8f8936fc485797 Mon Sep 17 00:00:00 2001 From: Mark Michon Date: Wed, 15 Nov 2023 14:01:29 -0800 Subject: [PATCH 045/446] docs: update infrahub-cli output path --- tasks/backend.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tasks/backend.py b/tasks/backend.py index a59345f7e6..2ecd64259e 100644 --- a/tasks/backend.py +++ b/tasks/backend.py @@ -31,7 +31,7 @@ def generate_doc(context: Context): print(f" - [{NAMESPACE}] Generate CLI documentation") with context.cd(ESCAPED_REPO_PATH): for command in CLI_COMMANDS: - exec_cmd = f'typer {command[0]} utils docs --name "{command[1]}" --output 
docs/components/infrahub-cli/{command[2]}.md' + exec_cmd = f'typer {command[0]} utils docs --name "{command[1]}" --output docs/reference/infrahub-cli/{command[2]}.md' context.run(exec_cmd) From 72b85b1b27155e88d185dccbb5a92b3cf0f62f25 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Wed, 15 Nov 2023 19:51:55 -0500 Subject: [PATCH 046/446] Add support for Pydantic 2 in the SDK and add matrix in pipeline --- .github/workflows/ci.yml | 45 +++++++++++++++++++++++++++++++++++---- python_sdk/pyproject.toml | 2 +- 2 files changed, 42 insertions(+), 5 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 37c811a5f7..79c1730f80 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -111,7 +111,11 @@ jobs: - name: "Linting: ruff format" run: "ruff format --check --diff ." - python-sdk-tests: + python-sdk-unit-tests: + strategy: + matrix: + python-version: ["3.9", "3.10", "3.11"] + pydantic-version: ["^1.10", "^2"] if: | always() && !cancelled() && !contains(needs.*.result, 'failure') && @@ -125,8 +129,17 @@ jobs: steps: - name: "Check out repository code" uses: "actions/checkout@v3" - - name: "Install Invoke" - run: "pip install toml invoke" + - name: Set up Python ${{ matrix.python-version }} + uses: actions/setup-python@v4 + with: + python-version: ${{ matrix.python-version }} + - name: "Setup environment" + run: | + pipx install poetry + pip install invoke toml + - name: Set Version of Pydantic + run: poetry install pydantic@${{ matrix.pydantic-version }} + working-directory: python_sdk/ - name: "Build Test Image" run: "invoke test.build" - name: "Pull External Docker Images" @@ -146,6 +159,30 @@ jobs: with: flag-name: python-sdk-unit parallel: true + + + python-sdk-integration-tests: + if: | + always() && !cancelled() && + !contains(needs.*.result, 'failure') && + !contains(needs.*.result, 'cancelled') && + needs.files-changed.outputs.sdk == 'true' + needs: ["python-sdk-unit-tests"] + runs-on: "ubuntu-20.04" + 
timeout-minutes: 30 + env: + INFRAHUB_DB_TYPE: memgraph + steps: + - name: "Check out repository code" + uses: "actions/checkout@v3" + - name: "Install Invoke" + run: "pip install toml invoke" + - name: "Build Test Image" + run: "invoke test.build" + - name: "Pull External Docker Images" + run: "invoke test.pull" + - name: "Pylint Tests" + run: "invoke sdk.pylint --docker" - name: "Integration Tests" run: "invoke sdk.test-integration" env: @@ -428,7 +465,7 @@ jobs: # ------------------------------------------ Coverall Report ------------------------------------------ coverall-report: - needs: ["frontend-tests", "backend-tests-default", "python-sdk-tests"] + needs: ["frontend-tests", "backend-tests-default", "python-sdk-integration-tests"] if: | always() && !cancelled() runs-on: ubuntu-latest diff --git a/python_sdk/pyproject.toml b/python_sdk/pyproject.toml index ca01693636..2eeb83cb4d 100644 --- a/python_sdk/pyproject.toml +++ b/python_sdk/pyproject.toml @@ -32,7 +32,7 @@ toml = "^0.10.2" jsonlines = "^3.1" deepdiff = "^6.2" ujson = "^5.7" -pydantic = "^1.10" +pydantic = "^1.10, ^2" pyyaml = "^6.0" gitpython = "3.1.40" From 498938cb52be0c350827d661eb0f5926a52209a2 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Wed, 15 Nov 2023 21:40:26 -0500 Subject: [PATCH 047/446] Add support for both pydantic v1 and v2 in the SDK --- python_sdk/infrahub_ctl/cli.py | 9 +- python_sdk/infrahub_ctl/config.py | 12 +- python_sdk/infrahub_sdk/branch.py | 11 +- python_sdk/infrahub_sdk/config.py | 40 ++- python_sdk/infrahub_sdk/playback.py | 10 +- python_sdk/infrahub_sdk/recorder.py | 8 +- python_sdk/infrahub_sdk/schema.py | 95 +++--- python_sdk/poetry.lock | 429 ++++++++++++++++++++-------- python_sdk/pyproject.toml | 12 +- 9 files changed, 418 insertions(+), 208 deletions(-) diff --git a/python_sdk/infrahub_ctl/cli.py b/python_sdk/infrahub_ctl/cli.py index e88aa5bf10..40759eb635 100644 --- a/python_sdk/infrahub_ctl/cli.py +++ b/python_sdk/infrahub_ctl/cli.py @@ -9,7 +9,12 @@ 
import jinja2 import typer -from pydantic import ValidationError + +try: + from pydantic import v1 as pydantic +except ImportError: + import pydantic + from rich.console import Console from rich.logging import RichHandler from rich.syntax import Syntax @@ -132,7 +137,7 @@ def render( # pylint: disable=too-many-branches,too-many-statements try: data = InfrahubRepositoryConfig(**config_file_data) - except ValidationError as exc: + except pydantic.ValidationError as exc: console.print(f"[red]Repository config file not valid, found {len(exc.errors())} error(s)") for error in exc.errors(): loc_str = [str(item) for item in error["loc"]] diff --git a/python_sdk/infrahub_ctl/config.py b/python_sdk/infrahub_ctl/config.py index fae75f1e07..035d68f124 100644 --- a/python_sdk/infrahub_ctl/config.py +++ b/python_sdk/infrahub_ctl/config.py @@ -4,14 +4,18 @@ import toml import typer -from pydantic import BaseSettings, ValidationError, root_validator + +try: + from pydantic import v1 as pydantic +except ImportError: + import pydantic DEFAULT_CONFIG_FILE = "infrahubctl.toml" ENVVAR_CONFIG_FILE = "INFRAHUBCTL_CONFIG" INFRAHUB_REPO_CONFIG_FILE = ".infrahub.yml" -class Settings(BaseSettings): +class Settings(pydantic.BaseSettings): """Main Settings Class for the project.""" server_address: str = "http://localhost:8000" @@ -27,7 +31,7 @@ class Config: "default_branch": {"env": "INFRAHUB_DEFAULT_BRANCH"}, } - @root_validator + @pydantic.root_validator def cleanup_server_address(cls, values: Dict[str, Any]) -> Dict[str, Any]: # pylint: disable=no-self-argument values["server_address"] = values["server_address"].rstrip("/") return values @@ -76,7 +80,7 @@ def load_and_exit( """ try: load(config_file=config_file, config_data=config_data) - except ValidationError as exc: + except pydantic.ValidationError as exc: print(f"Configuration not valid, found {len(exc.errors())} error(s)") for error in exc.errors(): loc_str = [str(item) for item in error["loc"]] diff --git 
a/python_sdk/infrahub_sdk/branch.py b/python_sdk/infrahub_sdk/branch.py index d21db26304..2596807b15 100644 --- a/python_sdk/infrahub_sdk/branch.py +++ b/python_sdk/infrahub_sdk/branch.py @@ -2,7 +2,10 @@ from typing import TYPE_CHECKING, Any, Dict, Optional, Union -from pydantic import BaseModel +try: + from pydantic import v1 as pydantic +except ImportError: + import pydantic # type: ignore[no-redef] from infrahub_sdk.exceptions import BranchNotFound from infrahub_sdk.graphql import Mutation @@ -12,13 +15,13 @@ from infrahub_sdk.client import InfrahubClient, InfrahubClientSync -class BranchData(BaseModel): +class BranchData(pydantic.BaseModel): id: str name: str - description: Optional[str] + description: Optional[str] = None is_data_only: bool is_default: bool - origin_branch: Optional[str] + origin_branch: Optional[str] = None branched_from: str diff --git a/python_sdk/infrahub_sdk/config.py b/python_sdk/infrahub_sdk/config.py index ddcc319956..1703923203 100644 --- a/python_sdk/infrahub_sdk/config.py +++ b/python_sdk/infrahub_sdk/config.py @@ -1,6 +1,9 @@ from typing import Any, Dict, Optional -from pydantic import BaseSettings, Field, root_validator, validator +try: + from pydantic import v1 as pydantic +except ImportError: + import pydantic # type: ignore[no-redef] from infrahub_sdk.playback import JSONPlayback from infrahub_sdk.recorder import JSONRecorder, Recorder, RecorderType @@ -8,25 +11,27 @@ from infrahub_sdk.utils import is_valid_url -class Config(BaseSettings): - address: str = Field( +class Config(pydantic.BaseSettings): + address: str = pydantic.Field( default="http://localhost:8000", description="The URL to use when connecting to Infrahub.", ) - api_token: Optional[str] = Field(default=None, description="API token for authentication against Infrahub.") - username: Optional[str] = Field(default=None, description="Username for accessing Infrahub", min_length=1) - password: Optional[str] = Field(default=None, description="Password for accessing 
Infrahub", min_length=1) - recorder: RecorderType = Field( + api_token: Optional[str] = pydantic.Field( + default=None, description="API token for authentication against Infrahub." + ) + username: Optional[str] = pydantic.Field(default=None, description="Username for accessing Infrahub", min_length=1) + password: Optional[str] = pydantic.Field(default=None, description="Password for accessing Infrahub", min_length=1) + recorder: RecorderType = pydantic.Field( default=RecorderType.NONE, description="Select builtin recorder for later replay.", ) - custom_recorder: Optional[Recorder] = Field( + custom_recorder: Optional[Recorder] = pydantic.Field( default=None, description="Provides a way to record responses from the Infrahub API", ) requester: Optional[AsyncRequester] = None - timeout: int = Field(default=10, description="Default connection timeout in seconds") - transport: RequesterTransport = Field( + timeout: int = pydantic.Field(default=10, description="Default connection timeout in seconds") + transport: RequesterTransport = pydantic.Field( default=RequesterTransport.HTTPX, description="Set an alternate transport using a predefined option", ) @@ -37,7 +42,8 @@ class Config: case_sensitive = False validate_assignment = True - @root_validator(pre=True) + @pydantic.root_validator(pre=True) + @classmethod @classmethod def validate_credentials_input(cls, values: Dict[str, Any]) -> Dict[str, Any]: has_username = "username" in values @@ -46,14 +52,16 @@ def validate_credentials_input(cls, values: Dict[str, Any]) -> Dict[str, Any]: raise ValueError("Both 'username' and 'password' needs to be set") return values - @root_validator(pre=True) + @pydantic.root_validator(pre=True) + @classmethod @classmethod def set_custom_recorder(cls, values: Dict[str, Any]) -> Dict[str, Any]: if values.get("recorder") == RecorderType.JSON and "custom_recorder" not in values: values["custom_recorder"] = JSONRecorder() return values - @root_validator(pre=True) + 
@pydantic.root_validator(pre=True) + @classmethod @classmethod def set_transport(cls, values: Dict[str, Any]) -> Dict[str, Any]: if values.get("transport") == RequesterTransport.JSON: @@ -65,14 +73,16 @@ def set_transport(cls, values: Dict[str, Any]) -> Dict[str, Any]: return values - @root_validator(pre=True) + @pydantic.root_validator(pre=True) + @classmethod @classmethod def validate_mix_authentication_schemes(cls, values: Dict[str, Any]) -> Dict[str, Any]: if values.get("password") and values.get("api_token"): raise ValueError("Unable to combine password with token based authentication") return values - @validator("address") + @pydantic.validator("address") + @classmethod @classmethod def validate_address(cls, value: str) -> str: if is_valid_url(value): diff --git a/python_sdk/infrahub_sdk/playback.py b/python_sdk/infrahub_sdk/playback.py index f066231f76..5176ba2fd7 100644 --- a/python_sdk/infrahub_sdk/playback.py +++ b/python_sdk/infrahub_sdk/playback.py @@ -2,14 +2,18 @@ from typing import Any, Dict, Optional import httpx -from pydantic import BaseSettings, Field + +try: + from pydantic import v1 as pydantic +except ImportError: + import pydantic # type: ignore[no-redef] from infrahub_sdk.types import HTTPMethod from infrahub_sdk.utils import generate_request_filename -class JSONPlayback(BaseSettings): - directory: str = Field(default=".", description="Directory to read recorded files from") +class JSONPlayback(pydantic.BaseSettings): + directory: str = pydantic.Field(default=".", description="Directory to read recorded files from") async def async_request( self, diff --git a/python_sdk/infrahub_sdk/recorder.py b/python_sdk/infrahub_sdk/recorder.py index 393575cd7f..89d225fad8 100644 --- a/python_sdk/infrahub_sdk/recorder.py +++ b/python_sdk/infrahub_sdk/recorder.py @@ -3,7 +3,11 @@ from typing import Protocol, runtime_checkable import httpx -from pydantic import BaseSettings + +try: + from pydantic import v1 as pydantic +except ImportError: + import 
pydantic # type: ignore[no-redef] from infrahub_sdk.utils import generate_request_filename @@ -19,7 +23,7 @@ def record(self, response: httpx.Response) -> None: ... -class JSONRecorder(BaseSettings): +class JSONRecorder(pydantic.BaseSettings): directory: str = "." host: str = "" diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index d21dae9e24..b21871eb77 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -14,7 +14,10 @@ Union, ) -from pydantic import BaseModel, Field +try: + from pydantic import v1 as pydantic +except ImportError: + import pydantic # type: ignore[no-redef] from infrahub_sdk.exceptions import SchemaNotFound, ValidationError @@ -27,25 +30,25 @@ # --------------------------------------------------------------------------------- # Repository Configuration file # --------------------------------------------------------------------------------- -class InfrahubRepositoryRFileConfig(BaseModel): +class InfrahubRepositoryRFileConfig(pydantic.BaseModel): name: str query: str repository: str template_path: Path -class InfrahubRepositoryConfig(BaseModel): - schemas: List[Path] = Field(default_factory=list) - rfiles: Optional[List[InfrahubRepositoryRFileConfig]] +class InfrahubRepositoryConfig(pydantic.BaseModel): + schemas: List[Path] = pydantic.Field(default_factory=list) + rfiles: Optional[List[InfrahubRepositoryRFileConfig]] = pydantic.Field(default_factory=list) # --------------------------------------------------------------------------------- # Main Infrahub Schema File # --------------------------------------------------------------------------------- -class FilterSchema(BaseModel): +class FilterSchema(pydantic.BaseModel): name: str kind: str - description: Optional[str] + description: Optional[str] = None class RelationshipCardinality(str, Enum): @@ -67,40 +70,40 @@ class RelationshipKind(str, Enum): GROUP = "Group" -class AttributeSchema(BaseModel): +class 
AttributeSchema(pydantic.BaseModel): name: str kind: str - label: Optional[str] - description: Optional[str] - default_value: Optional[Any] + label: Optional[str] = None + description: Optional[str] = None + default_value: Optional[Any] = None inherited: bool = False unique: bool = False - branch: Optional[BranchSupportType] + branch: Optional[BranchSupportType] = None optional: bool = False -class RelationshipSchema(BaseModel): +class RelationshipSchema(pydantic.BaseModel): name: str peer: str kind: RelationshipKind = RelationshipKind.GENERIC - label: Optional[str] - description: Optional[str] - identifier: Optional[str] + label: Optional[str] = None + description: Optional[str] = None + identifier: Optional[str] = None inherited: bool = False cardinality: str = "many" - branch: Optional[BranchSupportType] + branch: Optional[BranchSupportType] = None optional: bool = True - filters: List[FilterSchema] = Field(default_factory=list) + filters: List[FilterSchema] = pydantic.Field(default_factory=list) -class BaseNodeSchema(BaseModel): +class BaseNodeSchema(pydantic.BaseModel): name: str - label: Optional[str] + label: Optional[str] = None namespace: str - description: Optional[str] - attributes: List[AttributeSchema] = Field(default_factory=list) - relationships: List[RelationshipSchema] = Field(default_factory=list) - filters: List[FilterSchema] = Field(default_factory=list) + description: Optional[str] = None + attributes: List[AttributeSchema] = pydantic.Field(default_factory=list) + relationships: List[RelationshipSchema] = pydantic.Field(default_factory=list) + filters: List[FilterSchema] = pydantic.Field(default_factory=list) @property def kind(self) -> str: @@ -184,41 +187,41 @@ def unique_attributes(self) -> List[AttributeSchema]: class GenericSchema(BaseNodeSchema): """A Generic can be either an Interface or a Union depending if there are some Attributes or Relationships defined.""" - used_by: List[str] = Field(default_factory=list) + used_by: List[str] = 
pydantic.Field(default_factory=list) class NodeSchema(BaseNodeSchema): - inherit_from: List[str] = Field(default_factory=list) - groups: List[str] = Field(default_factory=list) - branch: Optional[BranchSupportType] - default_filter: Optional[str] + inherit_from: List[str] = pydantic.Field(default_factory=list) + groups: List[str] = pydantic.Field(default_factory=list) + branch: Optional[BranchSupportType] = None + default_filter: Optional[str] = None -class NodeExtensionSchema(BaseModel): - name: Optional[str] +class NodeExtensionSchema(pydantic.BaseModel): + name: Optional[str] = None kind: str - description: Optional[str] - label: Optional[str] - inherit_from: List[str] = Field(default_factory=list) - groups: List[str] = Field(default_factory=list) - branch: Optional[BranchSupportType] - default_filter: Optional[str] - attributes: List[AttributeSchema] = Field(default_factory=list) - relationships: List[RelationshipSchema] = Field(default_factory=list) + description: Optional[str] = None + label: Optional[str] = None + inherit_from: List[str] = pydantic.Field(default_factory=list) + groups: List[str] = pydantic.Field(default_factory=list) + branch: Optional[BranchSupportType] = None + default_filter: Optional[str] = None + attributes: List[AttributeSchema] = pydantic.Field(default_factory=list) + relationships: List[RelationshipSchema] = pydantic.Field(default_factory=list) -class GroupSchema(BaseModel): +class GroupSchema(pydantic.BaseModel): name: str kind: str - description: Optional[str] + description: Optional[str] = None -class SchemaRoot(BaseModel): +class SchemaRoot(pydantic.BaseModel): version: str - generics: List[GenericSchema] = Field(default_factory=list) - nodes: List[NodeSchema] = Field(default_factory=list) - groups: List[GroupSchema] = Field(default_factory=list) - # node_extensions: List[NodeExtensionSchema] = Field(default_factory=list) + generics: List[GenericSchema] = pydantic.Field(default_factory=list) + nodes: List[NodeSchema] = 
pydantic.Field(default_factory=list) + groups: List[GroupSchema] = pydantic.Field(default_factory=list) + # node_extensions: List[NodeExtensionSchema] = pydantic.Field(default_factory=list) class InfrahubSchemaBase: diff --git a/python_sdk/poetry.lock b/python_sdk/poetry.lock index 522a549fd0..b3285b5b7d 100644 --- a/python_sdk/poetry.lock +++ b/python_sdk/poetry.lock @@ -1,5 +1,19 @@ # This file is automatically @generated by Poetry 1.6.1 and should not be changed by hand. +[[package]] +name = "annotated-types" +version = "0.6.0" +description = "Reusable constraint types to use with typing.Annotated" +optional = false +python-versions = ">=3.8" +files = [ + {file = "annotated_types-0.6.0-py3-none-any.whl", hash = "sha256:0641064de18ba7a25dee8f96403ebc39113d0cb953a01429249d5c7564666a43"}, + {file = "annotated_types-0.6.0.tar.gz", hash = "sha256:563339e807e53ffd9c267e99fc6d9ea23eb8443c08f112651963e24e22f84a5d"}, +] + +[package.dependencies] +typing-extensions = {version = ">=4.0.0", markers = "python_version < \"3.9\""} + [[package]] name = "anyio" version = "4.0.0" @@ -21,6 +35,17 @@ doc = ["Sphinx (>=7)", "packaging", "sphinx-autodoc-typehints (>=1.2.0)"] test = ["anyio[trio]", "coverage[toml] (>=7)", "hypothesis (>=4.0)", "psutil (>=5.9)", "pytest (>=7.0)", "pytest-mock (>=3.6.1)", "trustme", "uvloop (>=0.17)"] trio = ["trio (>=0.22)"] +[[package]] +name = "appdirs" +version = "1.4.4" +description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." 
+optional = false +python-versions = "*" +files = [ + {file = "appdirs-1.4.4-py2.py3-none-any.whl", hash = "sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"}, + {file = "appdirs-1.4.4.tar.gz", hash = "sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41"}, +] + [[package]] name = "appnope" version = "0.1.3" @@ -352,13 +377,13 @@ files = [ [[package]] name = "deepdiff" -version = "6.6.1" +version = "6.7.1" description = "Deep Difference and Search of any Python object/data. Recreate objects by adding adding deltas to each other." optional = false python-versions = ">=3.7" files = [ - {file = "deepdiff-6.6.1-py3-none-any.whl", hash = "sha256:891b3cb12837e5d376ac0b58f4c8a2764e3a8bbceabb7108ff82235f1f2c4460"}, - {file = "deepdiff-6.6.1.tar.gz", hash = "sha256:75c75b1511f0e48edef2b70d785a9c32b2631666b465fa8c32270a77a7b950b5"}, + {file = "deepdiff-6.7.1-py3-none-any.whl", hash = "sha256:58396bb7a863cbb4ed5193f548c56f18218060362311aa1dc36397b2f25108bd"}, + {file = "deepdiff-6.7.1.tar.gz", hash = "sha256:b367e6fa6caac1c9f500adc79ada1b5b1242c50d5f716a1a4362030197847d30"}, ] [package.dependencies] @@ -495,13 +520,13 @@ files = [ [[package]] name = "httpcore" -version = "0.17.3" +version = "0.16.3" description = "A minimal low-level HTTP client." 
optional = false python-versions = ">=3.7" files = [ - {file = "httpcore-0.17.3-py3-none-any.whl", hash = "sha256:c2789b767ddddfa2a5782e3199b2b7f6894540b17b16ec26b2c4d8e103510b87"}, - {file = "httpcore-0.17.3.tar.gz", hash = "sha256:a6f30213335e34c1ade7be6ec7c47f19f50c56db36abef1a9dfa3815b1cb3888"}, + {file = "httpcore-0.16.3-py3-none-any.whl", hash = "sha256:da1fb708784a938aa084bde4feb8317056c55037247c787bd7e19eb2c2949dc0"}, + {file = "httpcore-0.16.3.tar.gz", hash = "sha256:c5d6f04e2fc530f39e0c077e6a30caa53f1451096120f1f38b954afd0b17c0cb"}, ] [package.dependencies] @@ -516,24 +541,24 @@ socks = ["socksio (==1.*)"] [[package]] name = "httpx" -version = "0.24.1" +version = "0.23.3" description = "The next generation HTTP client." optional = false python-versions = ">=3.7" files = [ - {file = "httpx-0.24.1-py3-none-any.whl", hash = "sha256:06781eb9ac53cde990577af654bd990a4949de37a28bdb4a230d434f3a30b9bd"}, - {file = "httpx-0.24.1.tar.gz", hash = "sha256:5853a43053df830c20f8110c5e69fe44d035d850b2dfe795e196f00fdb774bdd"}, + {file = "httpx-0.23.3-py3-none-any.whl", hash = "sha256:a211fcce9b1254ea24f0cd6af9869b3d29aba40154e947d2a07bb499b3e310d6"}, + {file = "httpx-0.23.3.tar.gz", hash = "sha256:9818458eb565bb54898ccb9b8b251a28785dd4a55afbc23d0eb410754fe7d0f9"}, ] [package.dependencies] certifi = "*" -httpcore = ">=0.15.0,<0.18.0" -idna = "*" +httpcore = ">=0.15.0,<0.17.0" +rfc3986 = {version = ">=1.3,<2", extras = ["idna2008"]} sniffio = "*" [package.extras] brotli = ["brotli", "brotlicffi"] -cli = ["click (==8.*)", "pygments (==2.*)", "rich (>=10,<14)"] +cli = ["click (==8.*)", "pygments (==2.*)", "rich (>=10,<13)"] http2 = ["h2 (>=3,<5)"] socks = ["socksio (==1.*)"] @@ -730,6 +755,16 @@ files = [ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = 
"sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, {file = 
"MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, @@ -800,38 +835,38 @@ files = [ [[package]] name = "mypy" -version = "1.6.1" +version = "1.7.0" description = "Optional static typing for Python" optional = false python-versions = ">=3.8" files = [ - {file = "mypy-1.6.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e5012e5cc2ac628177eaac0e83d622b2dd499e28253d4107a08ecc59ede3fc2c"}, - {file = "mypy-1.6.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d8fbb68711905f8912e5af474ca8b78d077447d8f3918997fecbf26943ff3cbb"}, - {file = "mypy-1.6.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21a1ad938fee7d2d96ca666c77b7c494c3c5bd88dff792220e1afbebb2925b5e"}, - {file = "mypy-1.6.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b96ae2c1279d1065413965c607712006205a9ac541895004a1e0d4f281f2ff9f"}, - {file = "mypy-1.6.1-cp310-cp310-win_amd64.whl", hash = "sha256:40b1844d2e8b232ed92e50a4bd11c48d2daa351f9deee6c194b83bf03e418b0c"}, - {file = "mypy-1.6.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:81af8adaa5e3099469e7623436881eff6b3b06db5ef75e6f5b6d4871263547e5"}, - {file = "mypy-1.6.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8c223fa57cb154c7eab5156856c231c3f5eace1e0bed9b32a24696b7ba3c3245"}, - {file = "mypy-1.6.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a8032e00ce71c3ceb93eeba63963b864bf635a18f6c0c12da6c13c450eedb183"}, - {file = "mypy-1.6.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:4c46b51de523817a0045b150ed11b56f9fff55f12b9edd0f3ed35b15a2809de0"}, - {file = "mypy-1.6.1-cp311-cp311-win_amd64.whl", hash = "sha256:19f905bcfd9e167159b3d63ecd8cb5e696151c3e59a1742e79bc3bcb540c42c7"}, - {file = 
"mypy-1.6.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:82e469518d3e9a321912955cc702d418773a2fd1e91c651280a1bda10622f02f"}, - {file = "mypy-1.6.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:d4473c22cc296425bbbce7e9429588e76e05bc7342da359d6520b6427bf76660"}, - {file = "mypy-1.6.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59a0d7d24dfb26729e0a068639a6ce3500e31d6655df8557156c51c1cb874ce7"}, - {file = "mypy-1.6.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:cfd13d47b29ed3bbaafaff7d8b21e90d827631afda134836962011acb5904b71"}, - {file = "mypy-1.6.1-cp312-cp312-win_amd64.whl", hash = "sha256:eb4f18589d196a4cbe5290b435d135dee96567e07c2b2d43b5c4621b6501531a"}, - {file = "mypy-1.6.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:41697773aa0bf53ff917aa077e2cde7aa50254f28750f9b88884acea38a16169"}, - {file = "mypy-1.6.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7274b0c57737bd3476d2229c6389b2ec9eefeb090bbaf77777e9d6b1b5a9d143"}, - {file = "mypy-1.6.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbaf4662e498c8c2e352da5f5bca5ab29d378895fa2d980630656178bd607c46"}, - {file = "mypy-1.6.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bb8ccb4724f7d8601938571bf3f24da0da791fe2db7be3d9e79849cb64e0ae85"}, - {file = "mypy-1.6.1-cp38-cp38-win_amd64.whl", hash = "sha256:68351911e85145f582b5aa6cd9ad666c8958bcae897a1bfda8f4940472463c45"}, - {file = "mypy-1.6.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:49ae115da099dcc0922a7a895c1eec82c1518109ea5c162ed50e3b3594c71208"}, - {file = "mypy-1.6.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8b27958f8c76bed8edaa63da0739d76e4e9ad4ed325c814f9b3851425582a3cd"}, - {file = "mypy-1.6.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:925cd6a3b7b55dfba252b7c4561892311c5358c6b5a601847015a1ad4eb7d332"}, - {file = "mypy-1.6.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = 
"sha256:8f57e6b6927a49550da3d122f0cb983d400f843a8a82e65b3b380d3d7259468f"}, - {file = "mypy-1.6.1-cp39-cp39-win_amd64.whl", hash = "sha256:a43ef1c8ddfdb9575691720b6352761f3f53d85f1b57d7745701041053deff30"}, - {file = "mypy-1.6.1-py3-none-any.whl", hash = "sha256:4cbe68ef919c28ea561165206a2dcb68591c50f3bcf777932323bc208d949cf1"}, - {file = "mypy-1.6.1.tar.gz", hash = "sha256:4d01c00d09a0be62a4ca3f933e315455bde83f37f892ba4b08ce92f3cf44bcc1"}, + {file = "mypy-1.7.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:5da84d7bf257fd8f66b4f759a904fd2c5a765f70d8b52dde62b521972a0a2357"}, + {file = "mypy-1.7.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a3637c03f4025f6405737570d6cbfa4f1400eb3c649317634d273687a09ffc2f"}, + {file = "mypy-1.7.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b633f188fc5ae1b6edca39dae566974d7ef4e9aaaae00bc36efe1f855e5173ac"}, + {file = "mypy-1.7.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d6ed9a3997b90c6f891138e3f83fb8f475c74db4ccaa942a1c7bf99e83a989a1"}, + {file = "mypy-1.7.0-cp310-cp310-win_amd64.whl", hash = "sha256:1fe46e96ae319df21359c8db77e1aecac8e5949da4773c0274c0ef3d8d1268a9"}, + {file = "mypy-1.7.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:df67fbeb666ee8828f675fee724cc2cbd2e4828cc3df56703e02fe6a421b7401"}, + {file = "mypy-1.7.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a79cdc12a02eb526d808a32a934c6fe6df07b05f3573d210e41808020aed8b5d"}, + {file = "mypy-1.7.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f65f385a6f43211effe8c682e8ec3f55d79391f70a201575def73d08db68ead1"}, + {file = "mypy-1.7.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:0e81ffd120ee24959b449b647c4b2fbfcf8acf3465e082b8d58fd6c4c2b27e46"}, + {file = "mypy-1.7.0-cp311-cp311-win_amd64.whl", hash = "sha256:f29386804c3577c83d76520abf18cfcd7d68264c7e431c5907d250ab502658ee"}, + {file = "mypy-1.7.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = 
"sha256:87c076c174e2c7ef8ab416c4e252d94c08cd4980a10967754f91571070bf5fbe"}, + {file = "mypy-1.7.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6cb8d5f6d0fcd9e708bb190b224089e45902cacef6f6915481806b0c77f7786d"}, + {file = "mypy-1.7.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d93e76c2256aa50d9c82a88e2f569232e9862c9982095f6d54e13509f01222fc"}, + {file = "mypy-1.7.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:cddee95dea7990e2215576fae95f6b78a8c12f4c089d7e4367564704e99118d3"}, + {file = "mypy-1.7.0-cp312-cp312-win_amd64.whl", hash = "sha256:d01921dbd691c4061a3e2ecdbfbfad029410c5c2b1ee88946bf45c62c6c91210"}, + {file = "mypy-1.7.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:185cff9b9a7fec1f9f7d8352dff8a4c713b2e3eea9c6c4b5ff7f0edf46b91e41"}, + {file = "mypy-1.7.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7a7b1e399c47b18feb6f8ad4a3eef3813e28c1e871ea7d4ea5d444b2ac03c418"}, + {file = "mypy-1.7.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc9fe455ad58a20ec68599139ed1113b21f977b536a91b42bef3ffed5cce7391"}, + {file = "mypy-1.7.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d0fa29919d2e720c8dbaf07d5578f93d7b313c3e9954c8ec05b6d83da592e5d9"}, + {file = "mypy-1.7.0-cp38-cp38-win_amd64.whl", hash = "sha256:2b53655a295c1ed1af9e96b462a736bf083adba7b314ae775563e3fb4e6795f5"}, + {file = "mypy-1.7.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c1b06b4b109e342f7dccc9efda965fc3970a604db70f8560ddfdee7ef19afb05"}, + {file = "mypy-1.7.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:bf7a2f0a6907f231d5e41adba1a82d7d88cf1f61a70335889412dec99feeb0f8"}, + {file = "mypy-1.7.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:551d4a0cdcbd1d2cccdcc7cb516bb4ae888794929f5b040bb51aae1846062901"}, + {file = "mypy-1.7.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:55d28d7963bef00c330cb6461db80b0b72afe2f3c4e2963c99517cf06454e665"}, + {file = 
"mypy-1.7.0-cp39-cp39-win_amd64.whl", hash = "sha256:870bd1ffc8a5862e593185a4c169804f2744112b4a7c55b93eb50f48e7a77010"}, + {file = "mypy-1.7.0-py3-none-any.whl", hash = "sha256:96650d9a4c651bc2a4991cf46f100973f656d69edc7faf91844e87fe627f7e96"}, + {file = "mypy-1.7.0.tar.gz", hash = "sha256:1e280b5697202efa698372d2f39e9a6713a0395a756b1c6bd48995f8d72690dc"}, ] [package.dependencies] @@ -842,6 +877,7 @@ typing-extensions = ">=4.1.0" [package.extras] dmypy = ["psutil (>=4.0)"] install-types = ["pip"] +mypyc = ["setuptools (>=50)"] reports = ["lxml"] [[package]] @@ -981,13 +1017,13 @@ files = [ [[package]] name = "platformdirs" -version = "3.11.0" +version = "4.0.0" description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." optional = false python-versions = ">=3.7" files = [ - {file = "platformdirs-3.11.0-py3-none-any.whl", hash = "sha256:e9d171d00af68be50e9202731309c4e658fd8bc76f55c11c7dd760d023bda68e"}, - {file = "platformdirs-3.11.0.tar.gz", hash = "sha256:cf8ee52a3afdb965072dcc652433e0c7e3e40cf5ea1477cd4b3b1d2eb75495b3"}, + {file = "platformdirs-4.0.0-py3-none-any.whl", hash = "sha256:118c954d7e949b35437270383a3f2531e99dd93cf7ce4dc8340d3356d30f173b"}, + {file = "platformdirs-4.0.0.tar.gz", hash = "sha256:cb633b2bcf10c51af60beb0ab06d2f1d69064b43abf4c185ca6b28865f3f9731"}, ] [package.extras] @@ -1022,13 +1058,13 @@ files = [ [[package]] name = "pre-commit" -version = "2.21.0" +version = "2.20.0" description = "A framework for managing and maintaining multi-language pre-commit hooks." 
optional = false python-versions = ">=3.7" files = [ - {file = "pre_commit-2.21.0-py2.py3-none-any.whl", hash = "sha256:e2f91727039fc39a92f58a588a25b87f936de6567eed4f0e673e0507edc75bad"}, - {file = "pre_commit-2.21.0.tar.gz", hash = "sha256:31ef31af7e474a8d8995027fefdfcf509b5c913ff31f2015b4ec4beb26a6f658"}, + {file = "pre_commit-2.20.0-py2.py3-none-any.whl", hash = "sha256:51a5ba7c480ae8072ecdb6933df22d2f812dc897d5fe848778116129a681aac7"}, + {file = "pre_commit-2.20.0.tar.gz", hash = "sha256:a978dac7bc9ec0bcee55c18a277d553b0f419d259dadb4b9418ff2d00eb43959"}, ] [package.dependencies] @@ -1036,17 +1072,18 @@ cfgv = ">=2.0.0" identify = ">=1.0.0" nodeenv = ">=0.11.1" pyyaml = ">=5.1" -virtualenv = ">=20.10.0" +toml = "*" +virtualenv = ">=20.0.8" [[package]] name = "prompt-toolkit" -version = "3.0.39" +version = "3.0.41" description = "Library for building powerful interactive command lines in Python" optional = false python-versions = ">=3.7.0" files = [ - {file = "prompt_toolkit-3.0.39-py3-none-any.whl", hash = "sha256:9dffbe1d8acf91e3de75f3b544e4842382fc06c6babe903ac9acb74dc6e08d88"}, - {file = "prompt_toolkit-3.0.39.tar.gz", hash = "sha256:04505ade687dc26dc4284b1ad19a83be2f2afe83e7a828ace0c72f3a1df72aac"}, + {file = "prompt_toolkit-3.0.41-py3-none-any.whl", hash = "sha256:f36fe301fafb7470e86aaf90f036eef600a3210be4decf461a5b1ca8403d3cb2"}, + {file = "prompt_toolkit-3.0.41.tar.gz", hash = "sha256:941367d97fc815548822aa26c2a269fdc4eb21e9ec05fc5d447cf09bad5d75f0"}, ] [package.dependencies] @@ -1079,55 +1116,154 @@ tests = ["pytest"] [[package]] name = "pydantic" -version = "1.10.13" -description = "Data validation and settings management using python type hints" +version = "2.5.1" +description = "Data validation using Python type hints" optional = false python-versions = ">=3.7" files = [ - {file = "pydantic-1.10.13-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:efff03cc7a4f29d9009d1c96ceb1e7a70a65cfe86e89d34e4a5f2ab1e5693737"}, - {file = 
"pydantic-1.10.13-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3ecea2b9d80e5333303eeb77e180b90e95eea8f765d08c3d278cd56b00345d01"}, - {file = "pydantic-1.10.13-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1740068fd8e2ef6eb27a20e5651df000978edce6da6803c2bef0bc74540f9548"}, - {file = "pydantic-1.10.13-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:84bafe2e60b5e78bc64a2941b4c071a4b7404c5c907f5f5a99b0139781e69ed8"}, - {file = "pydantic-1.10.13-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:bc0898c12f8e9c97f6cd44c0ed70d55749eaf783716896960b4ecce2edfd2d69"}, - {file = "pydantic-1.10.13-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:654db58ae399fe6434e55325a2c3e959836bd17a6f6a0b6ca8107ea0571d2e17"}, - {file = "pydantic-1.10.13-cp310-cp310-win_amd64.whl", hash = "sha256:75ac15385a3534d887a99c713aa3da88a30fbd6204a5cd0dc4dab3d770b9bd2f"}, - {file = "pydantic-1.10.13-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c553f6a156deb868ba38a23cf0df886c63492e9257f60a79c0fd8e7173537653"}, - {file = "pydantic-1.10.13-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5e08865bc6464df8c7d61439ef4439829e3ab62ab1669cddea8dd00cd74b9ffe"}, - {file = "pydantic-1.10.13-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e31647d85a2013d926ce60b84f9dd5300d44535a9941fe825dc349ae1f760df9"}, - {file = "pydantic-1.10.13-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:210ce042e8f6f7c01168b2d84d4c9eb2b009fe7bf572c2266e235edf14bacd80"}, - {file = "pydantic-1.10.13-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:8ae5dd6b721459bfa30805f4c25880e0dd78fc5b5879f9f7a692196ddcb5a580"}, - {file = "pydantic-1.10.13-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:f8e81fc5fb17dae698f52bdd1c4f18b6ca674d7068242b2aff075f588301bbb0"}, - {file = "pydantic-1.10.13-cp311-cp311-win_amd64.whl", hash = 
"sha256:61d9dce220447fb74f45e73d7ff3b530e25db30192ad8d425166d43c5deb6df0"}, - {file = "pydantic-1.10.13-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4b03e42ec20286f052490423682016fd80fda830d8e4119f8ab13ec7464c0132"}, - {file = "pydantic-1.10.13-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f59ef915cac80275245824e9d771ee939133be38215555e9dc90c6cb148aaeb5"}, - {file = "pydantic-1.10.13-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5a1f9f747851338933942db7af7b6ee8268568ef2ed86c4185c6ef4402e80ba8"}, - {file = "pydantic-1.10.13-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:97cce3ae7341f7620a0ba5ef6cf043975cd9d2b81f3aa5f4ea37928269bc1b87"}, - {file = "pydantic-1.10.13-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:854223752ba81e3abf663d685f105c64150873cc6f5d0c01d3e3220bcff7d36f"}, - {file = "pydantic-1.10.13-cp37-cp37m-win_amd64.whl", hash = "sha256:b97c1fac8c49be29486df85968682b0afa77e1b809aff74b83081cc115e52f33"}, - {file = "pydantic-1.10.13-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c958d053453a1c4b1c2062b05cd42d9d5c8eb67537b8d5a7e3c3032943ecd261"}, - {file = "pydantic-1.10.13-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4c5370a7edaac06daee3af1c8b1192e305bc102abcbf2a92374b5bc793818599"}, - {file = "pydantic-1.10.13-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d6f6e7305244bddb4414ba7094ce910560c907bdfa3501e9db1a7fd7eaea127"}, - {file = "pydantic-1.10.13-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d3a3c792a58e1622667a2837512099eac62490cdfd63bd407993aaf200a4cf1f"}, - {file = "pydantic-1.10.13-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:c636925f38b8db208e09d344c7aa4f29a86bb9947495dd6b6d376ad10334fb78"}, - {file = "pydantic-1.10.13-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:678bcf5591b63cc917100dc50ab6caebe597ac67e8c9ccb75e698f66038ea953"}, - {file = 
"pydantic-1.10.13-cp38-cp38-win_amd64.whl", hash = "sha256:6cf25c1a65c27923a17b3da28a0bdb99f62ee04230c931d83e888012851f4e7f"}, - {file = "pydantic-1.10.13-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8ef467901d7a41fa0ca6db9ae3ec0021e3f657ce2c208e98cd511f3161c762c6"}, - {file = "pydantic-1.10.13-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:968ac42970f57b8344ee08837b62f6ee6f53c33f603547a55571c954a4225691"}, - {file = "pydantic-1.10.13-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9849f031cf8a2f0a928fe885e5a04b08006d6d41876b8bbd2fc68a18f9f2e3fd"}, - {file = "pydantic-1.10.13-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:56e3ff861c3b9c6857579de282ce8baabf443f42ffba355bf070770ed63e11e1"}, - {file = "pydantic-1.10.13-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f00790179497767aae6bcdc36355792c79e7bbb20b145ff449700eb076c5f96"}, - {file = "pydantic-1.10.13-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:75b297827b59bc229cac1a23a2f7a4ac0031068e5be0ce385be1462e7e17a35d"}, - {file = "pydantic-1.10.13-cp39-cp39-win_amd64.whl", hash = "sha256:e70ca129d2053fb8b728ee7d1af8e553a928d7e301a311094b8a0501adc8763d"}, - {file = "pydantic-1.10.13-py3-none-any.whl", hash = "sha256:b87326822e71bd5f313e7d3bfdc77ac3247035ac10b0c0618bd99dcf95b1e687"}, - {file = "pydantic-1.10.13.tar.gz", hash = "sha256:32c8b48dcd3b2ac4e78b0ba4af3a2c2eb6048cb75202f0ea7b34feb740efc340"}, + {file = "pydantic-2.5.1-py3-none-any.whl", hash = "sha256:dc5244a8939e0d9a68f1f1b5f550b2e1c879912033b1becbedb315accc75441b"}, + {file = "pydantic-2.5.1.tar.gz", hash = "sha256:0b8be5413c06aadfbe56f6dc1d45c9ed25fd43264414c571135c97dd77c2bedb"}, ] [package.dependencies] -typing-extensions = ">=4.2.0" +annotated-types = ">=0.4.0" +pydantic-core = "2.14.3" +typing-extensions = ">=4.6.1" [package.extras] -dotenv = ["python-dotenv (>=0.10.4)"] -email = ["email-validator (>=1.0.3)"] +email = ["email-validator (>=2.0.0)"] + 
+[[package]] +name = "pydantic-core" +version = "2.14.3" +description = "" +optional = false +python-versions = ">=3.7" +files = [ + {file = "pydantic_core-2.14.3-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ba44fad1d114539d6a1509966b20b74d2dec9a5b0ee12dd7fd0a1bb7b8785e5f"}, + {file = "pydantic_core-2.14.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4a70d23eedd88a6484aa79a732a90e36701048a1509078d1b59578ef0ea2cdf5"}, + {file = "pydantic_core-2.14.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7cc24728a1a9cef497697e53b3d085fb4d3bc0ef1ef4d9b424d9cf808f52c146"}, + {file = "pydantic_core-2.14.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ab4a2381005769a4af2ffddae74d769e8a4aae42e970596208ec6d615c6fb080"}, + {file = "pydantic_core-2.14.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:905a12bf088d6fa20e094f9a477bf84bd823651d8b8384f59bcd50eaa92e6a52"}, + {file = "pydantic_core-2.14.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:38aed5a1bbc3025859f56d6a32f6e53ca173283cb95348e03480f333b1091e7d"}, + {file = "pydantic_core-2.14.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1767bd3f6370458e60c1d3d7b1d9c2751cc1ad743434e8ec84625a610c8b9195"}, + {file = "pydantic_core-2.14.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:7cb0c397f29688a5bd2c0dbd44451bc44ebb9b22babc90f97db5ec3e5bb69977"}, + {file = "pydantic_core-2.14.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:9ff737f24b34ed26de62d481ef522f233d3c5927279f6b7229de9b0deb3f76b5"}, + {file = "pydantic_core-2.14.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a1a39fecb5f0b19faee9a8a8176c805ed78ce45d760259a4ff3d21a7daa4dfc1"}, + {file = "pydantic_core-2.14.3-cp310-none-win32.whl", hash = "sha256:ccbf355b7276593c68fa824030e68cb29f630c50e20cb11ebb0ee450ae6b3d08"}, + {file = "pydantic_core-2.14.3-cp310-none-win_amd64.whl", hash = 
"sha256:536e1f58419e1ec35f6d1310c88496f0d60e4f182cacb773d38076f66a60b149"}, + {file = "pydantic_core-2.14.3-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:f1f46700402312bdc31912f6fc17f5ecaaaa3bafe5487c48f07c800052736289"}, + {file = "pydantic_core-2.14.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:88ec906eb2d92420f5b074f59cf9e50b3bb44f3cb70e6512099fdd4d88c2f87c"}, + {file = "pydantic_core-2.14.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:056ea7cc3c92a7d2a14b5bc9c9fa14efa794d9f05b9794206d089d06d3433dc7"}, + {file = "pydantic_core-2.14.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:076edc972b68a66870cec41a4efdd72a6b655c4098a232314b02d2bfa3bfa157"}, + {file = "pydantic_core-2.14.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e71f666c3bf019f2490a47dddb44c3ccea2e69ac882f7495c68dc14d4065eac2"}, + {file = "pydantic_core-2.14.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f518eac285c9632be337323eef9824a856f2680f943a9b68ac41d5f5bad7df7c"}, + {file = "pydantic_core-2.14.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9dbab442a8d9ca918b4ed99db8d89d11b1f067a7dadb642476ad0889560dac79"}, + {file = "pydantic_core-2.14.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0653fb9fc2fa6787f2fa08631314ab7fc8070307bd344bf9471d1b7207c24623"}, + {file = "pydantic_core-2.14.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:c54af5069da58ea643ad34ff32fd6bc4eebb8ae0fef9821cd8919063e0aeeaab"}, + {file = "pydantic_core-2.14.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cc956f78651778ec1ab105196e90e0e5f5275884793ab67c60938c75bcca3989"}, + {file = "pydantic_core-2.14.3-cp311-none-win32.whl", hash = "sha256:5b73441a1159f1fb37353aaefb9e801ab35a07dd93cb8177504b25a317f4215a"}, + {file = "pydantic_core-2.14.3-cp311-none-win_amd64.whl", hash = 
"sha256:7349f99f1ef8b940b309179733f2cad2e6037a29560f1b03fdc6aa6be0a8d03c"}, + {file = "pydantic_core-2.14.3-cp311-none-win_arm64.whl", hash = "sha256:ec79dbe23702795944d2ae4c6925e35a075b88acd0d20acde7c77a817ebbce94"}, + {file = "pydantic_core-2.14.3-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:8f5624f0f67f2b9ecaa812e1dfd2e35b256487566585160c6c19268bf2ffeccc"}, + {file = "pydantic_core-2.14.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6c2d118d1b6c9e2d577e215567eedbe11804c3aafa76d39ec1f8bc74e918fd07"}, + {file = "pydantic_core-2.14.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fe863491664c6720d65ae438d4efaa5eca766565a53adb53bf14bc3246c72fe0"}, + {file = "pydantic_core-2.14.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:136bc7247e97a921a020abbd6ef3169af97569869cd6eff41b6a15a73c44ea9b"}, + {file = "pydantic_core-2.14.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aeafc7f5bbddc46213707266cadc94439bfa87ecf699444de8be044d6d6eb26f"}, + {file = "pydantic_core-2.14.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e16aaf788f1de5a85c8f8fcc9c1ca1dd7dd52b8ad30a7889ca31c7c7606615b8"}, + {file = "pydantic_core-2.14.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8fc652c354d3362e2932a79d5ac4bbd7170757a41a62c4fe0f057d29f10bebb"}, + {file = "pydantic_core-2.14.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f1b92e72babfd56585c75caf44f0b15258c58e6be23bc33f90885cebffde3400"}, + {file = "pydantic_core-2.14.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:75f3f534f33651b73f4d3a16d0254de096f43737d51e981478d580f4b006b427"}, + {file = "pydantic_core-2.14.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:c9ffd823c46e05ef3eb28b821aa7bc501efa95ba8880b4a1380068e32c5bed47"}, + {file = "pydantic_core-2.14.3-cp312-none-win32.whl", hash = 
"sha256:12e05a76b223577a4696c76d7a6b36a0ccc491ffb3c6a8cf92d8001d93ddfd63"}, + {file = "pydantic_core-2.14.3-cp312-none-win_amd64.whl", hash = "sha256:1582f01eaf0537a696c846bea92082082b6bfc1103a88e777e983ea9fbdc2a0f"}, + {file = "pydantic_core-2.14.3-cp312-none-win_arm64.whl", hash = "sha256:96fb679c7ca12a512d36d01c174a4fbfd912b5535cc722eb2c010c7b44eceb8e"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-macosx_10_7_x86_64.whl", hash = "sha256:71ed769b58d44e0bc2701aa59eb199b6665c16e8a5b8b4a84db01f71580ec448"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-macosx_11_0_arm64.whl", hash = "sha256:5402ee0f61e7798ea93a01b0489520f2abfd9b57b76b82c93714c4318c66ca06"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eaab9dc009e22726c62fe3b850b797e7f0e7ba76d245284d1064081f512c7226"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:92486a04d54987054f8b4405a9af9d482e5100d6fe6374fc3303015983fc8bda"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cf08b43d1d5d1678f295f0431a4a7e1707d4652576e1d0f8914b5e0213bfeee5"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8ca13480ce16daad0504be6ce893b0ee8ec34cd43b993b754198a89e2787f7e"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:44afa3c18d45053fe8d8228950ee4c8eaf3b5a7f3b64963fdeac19b8342c987f"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:56814b41486e2d712a8bc02a7b1f17b87fa30999d2323bbd13cf0e52296813a1"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c3dc2920cc96f9aa40c6dc54256e436cc95c0a15562eb7bd579e1811593c377e"}, + {file = "pydantic_core-2.14.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = 
"sha256:e483b8b913fcd3b48badec54185c150cb7ab0e6487914b84dc7cde2365e0c892"}, + {file = "pydantic_core-2.14.3-cp37-none-win32.whl", hash = "sha256:364dba61494e48f01ef50ae430e392f67ee1ee27e048daeda0e9d21c3ab2d609"}, + {file = "pydantic_core-2.14.3-cp37-none-win_amd64.whl", hash = "sha256:a402ae1066be594701ac45661278dc4a466fb684258d1a2c434de54971b006ca"}, + {file = "pydantic_core-2.14.3-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:10904368261e4509c091cbcc067e5a88b070ed9a10f7ad78f3029c175487490f"}, + {file = "pydantic_core-2.14.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:260692420028319e201b8649b13ac0988974eeafaaef95d0dfbf7120c38dc000"}, + {file = "pydantic_core-2.14.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c1bf1a7b05a65d3b37a9adea98e195e0081be6b17ca03a86f92aeb8b110f468"}, + {file = "pydantic_core-2.14.3-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d7abd17a838a52140e3aeca271054e321226f52df7e0a9f0da8f91ea123afe98"}, + {file = "pydantic_core-2.14.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a5c51460ede609fbb4fa883a8fe16e749964ddb459966d0518991ec02eb8dfb9"}, + {file = "pydantic_core-2.14.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d06c78074646111fb01836585f1198367b17d57c9f427e07aaa9ff499003e58d"}, + {file = "pydantic_core-2.14.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:af452e69446fadf247f18ac5d153b1f7e61ef708f23ce85d8c52833748c58075"}, + {file = "pydantic_core-2.14.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e3ad4968711fb379a67c8c755beb4dae8b721a83737737b7bcee27c05400b047"}, + {file = "pydantic_core-2.14.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c5ea0153482e5b4d601c25465771c7267c99fddf5d3f3bdc238ef930e6d051cf"}, + {file = "pydantic_core-2.14.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:96eb10ef8920990e703da348bb25fedb8b8653b5966e4e078e5be382b430f9e0"}, + 
{file = "pydantic_core-2.14.3-cp38-none-win32.whl", hash = "sha256:ea1498ce4491236d1cffa0eee9ad0968b6ecb0c1cd711699c5677fc689905f00"}, + {file = "pydantic_core-2.14.3-cp38-none-win_amd64.whl", hash = "sha256:2bc736725f9bd18a60eec0ed6ef9b06b9785454c8d0105f2be16e4d6274e63d0"}, + {file = "pydantic_core-2.14.3-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:1ea992659c03c3ea811d55fc0a997bec9dde863a617cc7b25cfde69ef32e55af"}, + {file = "pydantic_core-2.14.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:d2b53e1f851a2b406bbb5ac58e16c4a5496038eddd856cc900278fa0da97f3fc"}, + {file = "pydantic_core-2.14.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0c7f8e8a7cf8e81ca7d44bea4f181783630959d41b4b51d2f74bc50f348a090f"}, + {file = "pydantic_core-2.14.3-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3b9c91eeb372a64ec6686c1402afd40cc20f61a0866850f7d989b6bf39a41a"}, + {file = "pydantic_core-2.14.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9ef3e2e407e4cad2df3c89488a761ed1f1c33f3b826a2ea9a411b0a7d1cccf1b"}, + {file = "pydantic_core-2.14.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f86f20a9d5bee1a6ede0f2757b917bac6908cde0f5ad9fcb3606db1e2968bcf5"}, + {file = "pydantic_core-2.14.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61beaa79d392d44dc19d6f11ccd824d3cccb865c4372157c40b92533f8d76dd0"}, + {file = "pydantic_core-2.14.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d41df8e10b094640a6b234851b624b76a41552f637b9fb34dc720b9fe4ef3be4"}, + {file = "pydantic_core-2.14.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2c08ac60c3caa31f825b5dbac47e4875bd4954d8f559650ad9e0b225eaf8ed0c"}, + {file = "pydantic_core-2.14.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:98d8b3932f1a369364606417ded5412c4ffb15bedbcf797c31317e55bd5d920e"}, + {file = "pydantic_core-2.14.3-cp39-none-win32.whl", hash = 
"sha256:caa94726791e316f0f63049ee00dff3b34a629b0d099f3b594770f7d0d8f1f56"}, + {file = "pydantic_core-2.14.3-cp39-none-win_amd64.whl", hash = "sha256:2494d20e4c22beac30150b4be3b8339bf2a02ab5580fa6553ca274bc08681a65"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:fe272a72c7ed29f84c42fedd2d06c2f9858dc0c00dae3b34ba15d6d8ae0fbaaf"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:7e63a56eb7fdee1587d62f753ccd6d5fa24fbeea57a40d9d8beaef679a24bdd6"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b7692f539a26265cece1e27e366df5b976a6db6b1f825a9e0466395b314ee48b"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:af46f0b7a1342b49f208fed31f5a83b8495bb14b652f621e0a6787d2f10f24ee"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6e2f9d76c00e805d47f19c7a96a14e4135238a7551a18bfd89bb757993fd0933"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:de52ddfa6e10e892d00f747bf7135d7007302ad82e243cf16d89dd77b03b649d"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:38113856c7fad8c19be7ddd57df0c3e77b1b2336459cb03ee3903ce9d5e236ce"}, + {file = "pydantic_core-2.14.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:354db020b1f8f11207b35360b92d95725621eb92656725c849a61e4b550f4acc"}, + {file = "pydantic_core-2.14.3-pp37-pypy37_pp73-macosx_10_7_x86_64.whl", hash = "sha256:76fc18653a5c95e5301a52d1b5afb27c9adc77175bf00f73e94f501caf0e05ad"}, + {file = "pydantic_core-2.14.3-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2646f8270f932d79ba61102a15ea19a50ae0d43b314e22b3f8f4b5fabbfa6e38"}, + {file = "pydantic_core-2.14.3-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:37dad73a2f82975ed563d6a277fd9b50e5d9c79910c4aec787e2d63547202315"}, + {file = "pydantic_core-2.14.3-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:113752a55a8eaece2e4ac96bc8817f134c2c23477e477d085ba89e3aa0f4dc44"}, + {file = "pydantic_core-2.14.3-pp37-pypy37_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:8488e973547e8fb1b4193fd9faf5236cf1b7cd5e9e6dc7ff6b4d9afdc4c720cb"}, + {file = "pydantic_core-2.14.3-pp37-pypy37_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:3d1dde10bd9962b1434053239b1d5490fc31a2b02d8950a5f731bc584c7a5a0f"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:2c83892c7bf92b91d30faca53bb8ea21f9d7e39f0ae4008ef2c2f91116d0464a"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:849cff945284c577c5f621d2df76ca7b60f803cc8663ff01b778ad0af0e39bb9"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa89919fbd8a553cd7d03bf23d5bc5deee622e1b5db572121287f0e64979476"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf15145b1f8056d12c67255cd3ce5d317cd4450d5ee747760d8d088d85d12a2d"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4cc6bb11f4e8e5ed91d78b9880774fbc0856cb226151b0a93b549c2b26a00c19"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:832d16f248ca0cc96929139734ec32d21c67669dcf8a9f3f733c85054429c012"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:b02b5e1f54c3396c48b665050464803c23c685716eb5d82a1d81bf81b5230da4"}, + {file = "pydantic_core-2.14.3-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:1f2d4516c32255782153e858f9a900ca6deadfb217fd3fb21bb2b60b4e04d04d"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = 
"sha256:0a3e51c2be472b7867eb0c5d025b91400c2b73a0823b89d4303a9097e2ec6655"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:df33902464410a1f1a0411a235f0a34e7e129f12cb6340daca0f9d1390f5fe10"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27828f0227b54804aac6fb077b6bb48e640b5435fdd7fbf0c274093a7b78b69c"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1e2979dc80246e18e348de51246d4c9b410186ffa3c50e77924bec436b1e36cb"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b28996872b48baf829ee75fa06998b607c66a4847ac838e6fd7473a6b2ab68e7"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:ca55c9671bb637ce13d18ef352fd32ae7aba21b4402f300a63f1fb1fd18e0364"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:aecd5ed096b0e5d93fb0367fd8f417cef38ea30b786f2501f6c34eabd9062c38"}, + {file = "pydantic_core-2.14.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:44aaf1a07ad0824e407dafc637a852e9a44d94664293bbe7d8ee549c356c8882"}, + {file = "pydantic_core-2.14.3.tar.gz", hash = "sha256:3ad083df8fe342d4d8d00cc1d3c1a23f0dc84fce416eb301e69f1ddbbe124d3f"}, +] + +[package.dependencies] +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0" + +[[package]] +name = "pydantic-settings" +version = "2.1.0" +description = "Settings management using Pydantic" +optional = false +python-versions = ">=3.8" +files = [ + {file = "pydantic_settings-2.1.0-py3-none-any.whl", hash = "sha256:7621c0cb5d90d1140d2f0ef557bdf03573aac7035948109adf2574770b77605a"}, + {file = "pydantic_settings-2.1.0.tar.gz", hash = "sha256:26b1492e0a24755626ac5e6d715e9077ab7ad4fb5f19a8b7ed7011d52f36141c"}, +] + +[package.dependencies] +pydantic = ">=2.3.0" +python-dotenv = ">=0.21.0" [[package]] name = "pyflakes" @@ -1259,17 
+1395,17 @@ testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtuale [[package]] name = "pytest-httpx" -version = "0.22.0" +version = "0.21.3" description = "Send responses to httpx." optional = false python-versions = ">=3.7" files = [ - {file = "pytest_httpx-0.22.0-py3-none-any.whl", hash = "sha256:cefb7dcf66a4cb0601b0de05e576cca423b6081f3245e7912a4d84c58fa3eae8"}, - {file = "pytest_httpx-0.22.0.tar.gz", hash = "sha256:3a82797f3a9a14d51e8c6b7fa97524b68b847ee801109c062e696b4744f4431c"}, + {file = "pytest_httpx-0.21.3-py3-none-any.whl", hash = "sha256:50b52b910f6f6cfb0aa65039d6f5bedb6ae3a0c02a98c4a7187543fe437c428a"}, + {file = "pytest_httpx-0.21.3.tar.gz", hash = "sha256:edcb62baceffbd57753c1a7afc4656b0e71e91c7a512e143c0adbac762d979c1"}, ] [package.dependencies] -httpx = "==0.24.*" +httpx = "==0.23.*" pytest = ">=6.0,<8.0" [package.extras] @@ -1277,13 +1413,13 @@ testing = ["pytest-asyncio (==0.20.*)", "pytest-cov (==4.*)"] [[package]] name = "pytest-xdist" -version = "3.3.1" +version = "3.4.0" description = "pytest xdist plugin for distributed testing, most importantly across multiple CPUs" optional = false python-versions = ">=3.7" files = [ - {file = "pytest-xdist-3.3.1.tar.gz", hash = "sha256:d5ee0520eb1b7bcca50a60a518ab7a7707992812c578198f8b44fdfac78e8c93"}, - {file = "pytest_xdist-3.3.1-py3-none-any.whl", hash = "sha256:ff9daa7793569e6a68544850fd3927cd257cc03a7ef76c95e86915355e82b5f2"}, + {file = "pytest-xdist-3.4.0.tar.gz", hash = "sha256:3a94a931dd9e268e0b871a877d09fe2efb6175c2c23d60d56a6001359002b832"}, + {file = "pytest_xdist-3.4.0-py3-none-any.whl", hash = "sha256:e513118bf787677a427e025606f55e95937565e06dfaac8d87f55301e57ae607"}, ] [package.dependencies] @@ -1309,6 +1445,20 @@ files = [ [package.dependencies] six = ">=1.5" +[[package]] +name = "python-dotenv" +version = "1.0.0" +description = "Read key-value pairs from a .env file and set them as environment variables" +optional = false +python-versions = ">=3.8" +files = [ + 
{file = "python-dotenv-1.0.0.tar.gz", hash = "sha256:a8df96034aae6d2d50a4ebe8216326c61c3eb64836776504fcca410e5937a3ba"}, + {file = "python_dotenv-1.0.0-py3-none-any.whl", hash = "sha256:f5971a9226b701070a4bf2c38c89e5a3f0d64de8debda981d1db98583009122a"}, +] + +[package.extras] +cli = ["click (>=5.0)"] + [[package]] name = "pytzdata" version = "2020.1" @@ -1332,6 +1482,7 @@ files = [ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"}, + {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"}, {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"}, {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"}, {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"}, @@ -1339,8 +1490,15 @@ files = [ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"}, + {file = 
"PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"}, {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"}, {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, + {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, + {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, + {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, + {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, + {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, + {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"}, {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"}, @@ -1357,6 +1515,7 @@ files = [ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"}, {file = 
"PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"}, + {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"}, {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"}, {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"}, {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"}, @@ -1364,6 +1523,7 @@ files = [ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"}, + {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"}, {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"}, {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"}, {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, @@ -1390,15 +1550,32 @@ urllib3 = ">=1.21.1,<3" socks = ["PySocks (>=1.5.6,!=1.5.7)"] 
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] +[[package]] +name = "rfc3986" +version = "1.5.0" +description = "Validating URI References per RFC 3986" +optional = false +python-versions = "*" +files = [ + {file = "rfc3986-1.5.0-py2.py3-none-any.whl", hash = "sha256:a86d6e1f5b1dc238b218b012df0aa79409667bb209e58da56d0b94704e712a97"}, + {file = "rfc3986-1.5.0.tar.gz", hash = "sha256:270aaf10d87d0d4e095063c65bf3ddbc6ee3d0b226328ce21e036f946e421835"}, +] + +[package.dependencies] +idna = {version = "*", optional = true, markers = "extra == \"idna2008\""} + +[package.extras] +idna2008 = ["idna"] + [[package]] name = "rich" -version = "13.6.0" +version = "13.7.0" description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal" optional = false python-versions = ">=3.7.0" files = [ - {file = "rich-13.6.0-py3-none-any.whl", hash = "sha256:2b38e2fe9ca72c9a00170a1a2d20c63c790d0e10ef1fe35eba76e1e7b1d7d245"}, - {file = "rich-13.6.0.tar.gz", hash = "sha256:5c14d22737e6d5084ef4771b62d5d4363165b403455a30a1c8ca39dc7b644bef"}, + {file = "rich-13.7.0-py3-none-any.whl", hash = "sha256:6da14c108c4866ee9520bbffa71f6fe3962e193b7da68720583850cd4548e235"}, + {file = "rich-13.7.0.tar.gz", hash = "sha256:5cb5123b5cf9ee70584244246816e9114227e0b98ad9176eede6ad54bf5403fa"}, ] [package.dependencies] @@ -1538,13 +1715,13 @@ files = [ [[package]] name = "tomlkit" -version = "0.12.1" +version = "0.12.3" description = "Style preserving TOML library" optional = false python-versions = ">=3.7" files = [ - {file = "tomlkit-0.12.1-py3-none-any.whl", hash = "sha256:712cbd236609acc6a3e2e97253dfc52d4c2082982a88f61b640ecf0817eab899"}, - {file = "tomlkit-0.12.1.tar.gz", hash = "sha256:38e1ff8edb991273ec9f6181244a6a391ac30e9f5098e7535640ea6be97a7c86"}, + {file = "tomlkit-0.12.3-py3-none-any.whl", hash = "sha256:b0a645a9156dc7cb5d3a1f0d4bab66db287fcb8e0430bdd4664a095ea16414ba"}, + {file = "tomlkit-0.12.3.tar.gz", hash = 
"sha256:75baf5012d06501f07bee5bf8e801b9f343e7aac5a92581f20f80ce632e6b5a4"}, ] [[package]] @@ -1725,61 +1902,61 @@ files = [ [[package]] name = "urllib3" -version = "2.0.7" +version = "2.1.0" description = "HTTP library with thread-safe connection pooling, file post, and more." optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"}, - {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"}, + {file = "urllib3-2.1.0-py3-none-any.whl", hash = "sha256:55901e917a5896a349ff771be919f8bd99aff50b79fe58fec595eb37bbc56bb3"}, + {file = "urllib3-2.1.0.tar.gz", hash = "sha256:df7aa8afb0148fa78488e7899b2c59b5f4ffcfa82e6c54ccb9dd37c1d7b52d54"}, ] [package.extras] brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"] -secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"] socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] zstd = ["zstandard (>=0.18.0)"] [[package]] name = "virtualenv" -version = "20.24.6" +version = "20.4.7" description = "Virtual Python Environment builder" optional = false -python-versions = ">=3.7" +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7" files = [ - {file = "virtualenv-20.24.6-py3-none-any.whl", hash = "sha256:520d056652454c5098a00c0f073611ccbea4c79089331f60bf9d7ba247bb7381"}, - {file = "virtualenv-20.24.6.tar.gz", hash = "sha256:02ece4f56fbf939dbbc33c0715159951d6bf14aaf5457b092e4548e1382455af"}, + {file = "virtualenv-20.4.7-py2.py3-none-any.whl", hash = "sha256:2b0126166ea7c9c3661f5b8e06773d28f83322de7a3ff7d06f0aed18c9de6a76"}, + {file = "virtualenv-20.4.7.tar.gz", hash = "sha256:14fdf849f80dbb29a4eb6caa9875d476ee2a5cf76a5f5415fa2f1606010ab467"}, ] [package.dependencies] -distlib = ">=0.3.7,<1" -filelock = ">=3.12.2,<4" -platformdirs = ">=3.9.1,<4" +appdirs = ">=1.4.3,<2" +distlib = 
">=0.3.1,<1" +filelock = ">=3.0.0,<4" +six = ">=1.9.0,<2" [package.extras] -docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"] -test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8)", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10)"] +docs = ["proselint (>=0.10.2)", "sphinx (>=3)", "sphinx-argparse (>=0.2.5)", "sphinx-rtd-theme (>=0.4.3)", "towncrier (>=19.9.0rc1)"] +testing = ["coverage (>=4)", "coverage-enable-subprocess (>=1)", "flaky (>=3)", "packaging (>=20.0)", "pytest (>=4)", "pytest-env (>=0.6.2)", "pytest-freezegun (>=0.4.1)", "pytest-mock (>=2)", "pytest-randomly (>=1)", "pytest-timeout (>=1)", "xonsh (>=0.9.16)"] [[package]] name = "wcwidth" -version = "0.2.9" +version = "0.2.10" description = "Measures the displayed width of unicode strings in a terminal" optional = false python-versions = "*" files = [ - {file = "wcwidth-0.2.9-py2.py3-none-any.whl", hash = "sha256:9a929bd8380f6cd9571a968a9c8f4353ca58d7cd812a4822bba831f8d685b223"}, - {file = "wcwidth-0.2.9.tar.gz", hash = "sha256:a675d1a4a2d24ef67096a04b85b02deeecd8e226f57b5e3a72dbb9ed99d27da8"}, + {file = "wcwidth-0.2.10-py2.py3-none-any.whl", hash = "sha256:aec5179002dd0f0d40c456026e74a729661c9d468e1ed64405e3a6c2176ca36f"}, + {file = "wcwidth-0.2.10.tar.gz", hash = "sha256:390c7454101092a6a5e43baad8f83de615463af459201709556b6e4b1c861f97"}, ] [[package]] name = "yamllint" -version = "1.32.0" +version = "1.33.0" description = "A linter for YAML files." 
optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "yamllint-1.32.0-py3-none-any.whl", hash = "sha256:d97a66e48da820829d96077d76b8dfbe6c6140f106e558dae87e81ac4e6b30b7"}, - {file = "yamllint-1.32.0.tar.gz", hash = "sha256:d01dde008c65de5b235188ab3110bebc59d18e5c65fc8a58267cd211cd9df34a"}, + {file = "yamllint-1.33.0-py3-none-any.whl", hash = "sha256:28a19f5d68d28d8fec538a1db21bb2d84c7dc2e2ea36266da8d4d1c5a683814d"}, + {file = "yamllint-1.33.0.tar.gz", hash = "sha256:2dceab9ef2d99518a2fcf4ffc964d44250ac4459be1ba3ca315118e4a1a81f7d"}, ] [package.dependencies] @@ -1792,4 +1969,4 @@ dev = ["doc8", "flake8", "flake8-import-order", "rstcheck[sphinx]", "sphinx"] [metadata] lock-version = "2.0" python-versions = "^3.8" -content-hash = "c99d3b78ea0bf7c735448b0aa273131305358985357d570b5ae9b05567d3588e" +content-hash = "e321dd0d570bb3b10c7363eeba6cd62a973ffc49d9417d176dbfaf3e89e953d1" diff --git a/python_sdk/pyproject.toml b/python_sdk/pyproject.toml index 2eeb83cb4d..c4d1c59c77 100644 --- a/python_sdk/pyproject.toml +++ b/python_sdk/pyproject.toml @@ -23,7 +23,7 @@ classifiers = [ [tool.poetry.dependencies] python = "^3.8" -httpx = "^0.24.0" +httpx = "^0.23" rich = "^13.3" pendulum = "~2.1" typer = "^0.7" @@ -32,12 +32,15 @@ toml = "^0.10.2" jsonlines = "^3.1" deepdiff = "^6.2" ujson = "^5.7" -pydantic = "^1.10, ^2" +pydantic = ">=1.7.4,!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0" pyyaml = "^6.0" gitpython = "3.1.40" [tool.poetry.group.dev.dependencies] pytest = "*" +pytest-clarity = "^1.0.1" +pytest-cov = "^4.0.0" +pytest-httpx = "*" yamllint = "*" pylint = "*" mypy = "*" @@ -46,13 +49,10 @@ pytest-asyncio = "*" requests = "*" pre-commit = "^2.20.0" autoflake = "*" -pytest-clarity = "^1.0.1" -pytest-httpx = "^0.22" types-toml = "*" types-ujson = "*" types-pyyaml = "*" typer-cli = "*" -pytest-cov = "^4.0.0" ruff = "^0.1.5" pytest-xdist = "^3.3.1" buildkite-test-collector = "^0.1.7" @@ -193,7 +193,7 @@ skip-magic-trailing-comma = false 
line-ending = "auto" [tool.ruff.lint.isort] -known-first-party = ["infrahub_sdk"] +known-first-party = ["infrahub_sdk", "infrahub_ctl"] [tool.ruff.lint.pycodestyle] max-line-length = 150 From 3ee0a48d7d19d90a540371a8904479fa9979d67d Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Wed, 15 Nov 2023 21:40:57 -0500 Subject: [PATCH 048/446] Force Pydantic to v1 in Infrahub --- poetry.lock | 460 ++++++++++++++++++++++++++----------------------- pyproject.toml | 2 +- 2 files changed, 245 insertions(+), 217 deletions(-) diff --git a/poetry.lock b/poetry.lock index a3fa8eadf7..ac5a1e44bc 100644 --- a/poetry.lock +++ b/poetry.lock @@ -86,6 +86,17 @@ doc = ["Sphinx (>=7)", "packaging", "sphinx-autodoc-typehints (>=1.2.0)"] test = ["anyio[trio]", "coverage[toml] (>=7)", "hypothesis (>=4.0)", "psutil (>=5.9)", "pytest (>=7.0)", "pytest-mock (>=3.6.1)", "trustme", "uvloop (>=0.17)"] trio = ["trio (>=0.22)"] +[[package]] +name = "appdirs" +version = "1.4.4" +description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." +optional = false +python-versions = "*" +files = [ + {file = "appdirs-1.4.4-py2.py3-none-any.whl", hash = "sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"}, + {file = "appdirs-1.4.4.tar.gz", hash = "sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41"}, +] + [[package]] name = "appnope" version = "0.1.3" @@ -710,31 +721,31 @@ pytz = ">2021.1" [[package]] name = "dagit" -version = "1.5.6" +version = "1.5.8" description = "Web UI for dagster." 
optional = false python-versions = "*" files = [ - {file = "dagit-1.5.6-py3-none-any.whl", hash = "sha256:57e967173ffc62b36cc31726d9c9a8e21fd8c8d67cab79f8dce16b8b51c0b926"}, - {file = "dagit-1.5.6.tar.gz", hash = "sha256:cc4a7606f8b1129922f5efe9b7f894a6fc00d257f6a26f5ca8d69f98fcff6c23"}, + {file = "dagit-1.5.8-py3-none-any.whl", hash = "sha256:7fac1f954c7634d8f094587d59514d600a086943258b9ae41fbf8e73e0aae872"}, + {file = "dagit-1.5.8.tar.gz", hash = "sha256:66df7996cb8dbbc8574206d7f269e2c277930e23a10c81d4f37c61f4885bd715"}, ] [package.dependencies] -dagster-webserver = "1.5.6" +dagster-webserver = "1.5.8" [package.extras] -notebook = ["dagster-webserver[notebook] (==1.5.6)"] -test = ["dagster-webserver[test] (==1.5.6)"] +notebook = ["dagster-webserver[notebook] (==1.5.8)"] +test = ["dagster-webserver[test] (==1.5.8)"] [[package]] name = "dagster" -version = "1.5.6" +version = "1.5.8" description = "Dagster is an orchestration platform for the development, production, and observation of data assets." 
optional = false python-versions = "*" files = [ - {file = "dagster-1.5.6-py3-none-any.whl", hash = "sha256:662e4d6bfed90251c244ea6fe4a3ce93ffcae7c82c572b9c2eec17ff5547fc3d"}, - {file = "dagster-1.5.6.tar.gz", hash = "sha256:f8d71f15f177f5421b5595330bd7b2a253588a0fac0c44bdc933b0f15ad5c6f5"}, + {file = "dagster-1.5.8-py3-none-any.whl", hash = "sha256:d4aa3d3b76c8add5bdbe16a749c722bd5ef684138892084767610b1b87422b6f"}, + {file = "dagster-1.5.8.tar.gz", hash = "sha256:98ad30e7bc45f6b04a3d0d4672270566d22d401ab43c0d2c647fb78864954ba2"}, ] [package.dependencies] @@ -742,13 +753,13 @@ alembic = ">=1.2.1,<1.6.3 || >1.6.3,<1.7.0 || >1.7.0,<1.11.0 || >1.11.0" click = ">=5.0" coloredlogs = ">=6.1,<=14.0" croniter = ">=0.3.34" -dagster-pipes = "1.5.6" +dagster-pipes = "1.5.8" docstring-parser = "*" grpcio = ">=1.44.0" grpcio-health-checking = ">=1.44.0" Jinja2 = "*" packaging = ">=20.9" -pendulum = "*" +pendulum = "<3" protobuf = ">=3.20.0" psutil = {version = ">=1.0", markers = "platform_system == \"Windows\""} pydantic = ">1.10.0,<1.10.7 || >1.10.7" @@ -777,17 +788,17 @@ test = ["buildkite-test-collector", "docker", "grpcio-tools (>=1.44.0)", "mock ( [[package]] name = "dagster-graphql" -version = "1.5.6" +version = "1.5.8" description = "The GraphQL frontend to python dagster." 
optional = false python-versions = "*" files = [ - {file = "dagster-graphql-1.5.6.tar.gz", hash = "sha256:5d77e1ca22c930a9256b36eeb04a418371ab1e609a92b08fcc8853134f2b4b48"}, - {file = "dagster_graphql-1.5.6-py3-none-any.whl", hash = "sha256:40fbb4d9c87265308524a8e94d1b07ee7448eaab43730259ff7a9585ac4e8b79"}, + {file = "dagster-graphql-1.5.8.tar.gz", hash = "sha256:04d67fb76db7d75e0d7e839c1191214543c983dcb7085d5e7ba3b9d5573c81f7"}, + {file = "dagster_graphql-1.5.8-py3-none-any.whl", hash = "sha256:408d709e3c2c83e5abbf5416ef1bbde23adbdee93e6938fedd2681a566dd521b"}, ] [package.dependencies] -dagster = "1.5.6" +dagster = "1.5.8" gql = {version = ">=3.0.0", extras = ["requests"]} graphene = ">=3" requests = "*" @@ -795,30 +806,30 @@ starlette = "*" [[package]] name = "dagster-pipes" -version = "1.5.6" +version = "1.5.8" description = "Toolkit for Dagster integrations with transform logic outside of Dagster" optional = false python-versions = "*" files = [ - {file = "dagster-pipes-1.5.6.tar.gz", hash = "sha256:eed934bc2e40d2edee173bbfbb4efc04c5b800a7f89ad9cb7bb8511cf8d945ed"}, - {file = "dagster_pipes-1.5.6-py3-none-any.whl", hash = "sha256:eb33f65f1d3c62d7d56de03977f4eb8996b2ff108cdab77da85ee24701ad70e7"}, + {file = "dagster-pipes-1.5.8.tar.gz", hash = "sha256:b632a5aad45f6fc788731c6ef3b0afb167299dd5910c1212fbedcf5595ca11ff"}, + {file = "dagster_pipes-1.5.8-py3-none-any.whl", hash = "sha256:719201a63193a67294bde2936852ef848b97dbaadcd56e1316fa3eb08afef7e2"}, ] [[package]] name = "dagster-webserver" -version = "1.5.6" +version = "1.5.8" description = "Web UI for dagster." 
optional = false python-versions = "*" files = [ - {file = "dagster_webserver-1.5.6-py3-none-any.whl", hash = "sha256:80af5e3af3059cdd79b0549f63335ffbebb85a5901dfd9c9b97a6756146dbc5f"}, - {file = "dagster_webserver-1.5.6.tar.gz", hash = "sha256:679dc38fa5a346773ac47db20bd2032fc138b3a379425f240950de1e65d03335"}, + {file = "dagster_webserver-1.5.8-py3-none-any.whl", hash = "sha256:5a9df81ea4417b0a22a7a2b7926c9ed4ce4513fe20ce43c1cd791ddaa2290329"}, + {file = "dagster_webserver-1.5.8.tar.gz", hash = "sha256:e80576b9e18f0a8781c2cfee17ec4027c5a54f4f9edc87aa15f9e1a28e4d9cd0"}, ] [package.dependencies] click = ">=7.0,<9.0" -dagster = "1.5.6" -dagster-graphql = "1.5.6" +dagster = "1.5.8" +dagster-graphql = "1.5.8" starlette = "*" uvicorn = {version = "*", extras = ["standard"]} @@ -839,13 +850,13 @@ files = [ [[package]] name = "deepdiff" -version = "6.7.0" +version = "6.7.1" description = "Deep Difference and Search of any Python object/data. Recreate objects by adding adding deltas to each other." optional = false python-versions = ">=3.7" files = [ - {file = "deepdiff-6.7.0-py3-none-any.whl", hash = "sha256:d64dd64be5b2e3917c7cc557d69e68d008d798a5cd4981d1508707037504d50a"}, - {file = "deepdiff-6.7.0.tar.gz", hash = "sha256:4c60fc1da4ac12aa73de98b7f303971607c6f928867fabf143cd51a434badb7d"}, + {file = "deepdiff-6.7.1-py3-none-any.whl", hash = "sha256:58396bb7a863cbb4ed5193f548c56f18218060362311aa1dc36397b2f25108bd"}, + {file = "deepdiff-6.7.1.tar.gz", hash = "sha256:b367e6fa6caac1c9f500adc79ada1b5b1242c50d5f716a1a4362030197847d30"}, ] [package.dependencies] @@ -1718,13 +1729,13 @@ files = [ [[package]] name = "httpcore" -version = "0.17.3" +version = "0.16.3" description = "A minimal low-level HTTP client." 
optional = false python-versions = ">=3.7" files = [ - {file = "httpcore-0.17.3-py3-none-any.whl", hash = "sha256:c2789b767ddddfa2a5782e3199b2b7f6894540b17b16ec26b2c4d8e103510b87"}, - {file = "httpcore-0.17.3.tar.gz", hash = "sha256:a6f30213335e34c1ade7be6ec7c47f19f50c56db36abef1a9dfa3815b1cb3888"}, + {file = "httpcore-0.16.3-py3-none-any.whl", hash = "sha256:da1fb708784a938aa084bde4feb8317056c55037247c787bd7e19eb2c2949dc0"}, + {file = "httpcore-0.16.3.tar.gz", hash = "sha256:c5d6f04e2fc530f39e0c077e6a30caa53f1451096120f1f38b954afd0b17c0cb"}, ] [package.dependencies] @@ -1787,24 +1798,24 @@ test = ["Cython (>=0.29.24,<0.30.0)"] [[package]] name = "httpx" -version = "0.24.1" +version = "0.23.3" description = "The next generation HTTP client." optional = false python-versions = ">=3.7" files = [ - {file = "httpx-0.24.1-py3-none-any.whl", hash = "sha256:06781eb9ac53cde990577af654bd990a4949de37a28bdb4a230d434f3a30b9bd"}, - {file = "httpx-0.24.1.tar.gz", hash = "sha256:5853a43053df830c20f8110c5e69fe44d035d850b2dfe795e196f00fdb774bdd"}, + {file = "httpx-0.23.3-py3-none-any.whl", hash = "sha256:a211fcce9b1254ea24f0cd6af9869b3d29aba40154e947d2a07bb499b3e310d6"}, + {file = "httpx-0.23.3.tar.gz", hash = "sha256:9818458eb565bb54898ccb9b8b251a28785dd4a55afbc23d0eb410754fe7d0f9"}, ] [package.dependencies] certifi = "*" -httpcore = ">=0.15.0,<0.18.0" -idna = "*" +httpcore = ">=0.15.0,<0.17.0" +rfc3986 = {version = ">=1.3,<2", extras = ["idna2008"]} sniffio = "*" [package.extras] brotli = ["brotli", "brotlicffi"] -cli = ["click (==8.*)", "pygments (==2.*)", "rich (>=10,<14)"] +cli = ["click (==8.*)", "pygments (==2.*)", "rich (>=10,<13)"] http2 = ["h2 (>=3,<5)"] socks = ["socksio (==1.*)"] @@ -1896,11 +1907,11 @@ develop = true [package.dependencies] deepdiff = "^6.2" gitpython = "3.1.40" -httpx = "^0.24.0" +httpx = "^0.23" Jinja2 = "^3.1" jsonlines = "^3.1" pendulum = "~2.1" -pydantic = "^1.10" +pydantic = ">=1.7.4,!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0" pyyaml = "^6.0" 
rich = "^13.3" toml = "^0.10.2" @@ -2041,13 +2052,13 @@ attrs = ">=19.2.0" [[package]] name = "locust" -version = "2.18.1" +version = "2.18.3" description = "Developer friendly load testing framework" optional = false python-versions = ">=3.8" files = [ - {file = "locust-2.18.1-py3-none-any.whl", hash = "sha256:95cb37490487db693b9a85914ca13a962d74de7b9588519909a5a678512dfaef"}, - {file = "locust-2.18.1.tar.gz", hash = "sha256:503985a240f8f8098636c9493689caca2cc923b83f241393bb2c92bde5801278"}, + {file = "locust-2.18.3-py3-none-any.whl", hash = "sha256:af895c029b1b2f8fee12c2877119676fe4b77f955752d2c6d21a17659ebe87ed"}, + {file = "locust-2.18.3.tar.gz", hash = "sha256:a5ffd8f18c6d4d8a5c284bf5b6da5e1b4712e8e2217a161ab6857ece38767207"}, ] [package.dependencies] @@ -2067,13 +2078,13 @@ Werkzeug = ">=2.0.0" [[package]] name = "mako" -version = "1.2.4" +version = "1.3.0" description = "A super-fast templating language that borrows the best ideas from the existing templating languages." optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "Mako-1.2.4-py3-none-any.whl", hash = "sha256:c97c79c018b9165ac9922ae4f32da095ffd3c4e6872b45eded42926deea46818"}, - {file = "Mako-1.2.4.tar.gz", hash = "sha256:d60a3903dc3bb01a18ad6a89cdbe2e4eadc69c0bc8ef1e3773ba53d44c3f7a34"}, + {file = "Mako-1.3.0-py3-none-any.whl", hash = "sha256:57d4e997349f1a92035aa25c17ace371a4213f2ca42f99bee9a602500cfd54d9"}, + {file = "Mako-1.3.0.tar.gz", hash = "sha256:e3a9d388fd00e87043edbe8792f45880ac0114e9c4adc69f6e9bfb2c55e3b11b"}, ] [package.dependencies] @@ -2135,6 +2146,16 @@ files = [ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = 
"sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, + {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, {file = 
"MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, @@ -2353,38 +2374,38 @@ files = [ [[package]] name = "mypy" -version = "1.6.1" +version = "1.7.0" description = "Optional static typing for Python" optional = false python-versions = ">=3.8" files = [ - {file = "mypy-1.6.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e5012e5cc2ac628177eaac0e83d622b2dd499e28253d4107a08ecc59ede3fc2c"}, - {file = "mypy-1.6.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d8fbb68711905f8912e5af474ca8b78d077447d8f3918997fecbf26943ff3cbb"}, - {file = "mypy-1.6.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21a1ad938fee7d2d96ca666c77b7c494c3c5bd88dff792220e1afbebb2925b5e"}, - {file = "mypy-1.6.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b96ae2c1279d1065413965c607712006205a9ac541895004a1e0d4f281f2ff9f"}, - {file = "mypy-1.6.1-cp310-cp310-win_amd64.whl", hash = "sha256:40b1844d2e8b232ed92e50a4bd11c48d2daa351f9deee6c194b83bf03e418b0c"}, - {file = "mypy-1.6.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:81af8adaa5e3099469e7623436881eff6b3b06db5ef75e6f5b6d4871263547e5"}, - {file = "mypy-1.6.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8c223fa57cb154c7eab5156856c231c3f5eace1e0bed9b32a24696b7ba3c3245"}, - {file = "mypy-1.6.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a8032e00ce71c3ceb93eeba63963b864bf635a18f6c0c12da6c13c450eedb183"}, - {file = "mypy-1.6.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:4c46b51de523817a0045b150ed11b56f9fff55f12b9edd0f3ed35b15a2809de0"}, - {file = "mypy-1.6.1-cp311-cp311-win_amd64.whl", hash = "sha256:19f905bcfd9e167159b3d63ecd8cb5e696151c3e59a1742e79bc3bcb540c42c7"}, - {file = "mypy-1.6.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:82e469518d3e9a321912955cc702d418773a2fd1e91c651280a1bda10622f02f"}, - {file = 
"mypy-1.6.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:d4473c22cc296425bbbce7e9429588e76e05bc7342da359d6520b6427bf76660"}, - {file = "mypy-1.6.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59a0d7d24dfb26729e0a068639a6ce3500e31d6655df8557156c51c1cb874ce7"}, - {file = "mypy-1.6.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:cfd13d47b29ed3bbaafaff7d8b21e90d827631afda134836962011acb5904b71"}, - {file = "mypy-1.6.1-cp312-cp312-win_amd64.whl", hash = "sha256:eb4f18589d196a4cbe5290b435d135dee96567e07c2b2d43b5c4621b6501531a"}, - {file = "mypy-1.6.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:41697773aa0bf53ff917aa077e2cde7aa50254f28750f9b88884acea38a16169"}, - {file = "mypy-1.6.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7274b0c57737bd3476d2229c6389b2ec9eefeb090bbaf77777e9d6b1b5a9d143"}, - {file = "mypy-1.6.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbaf4662e498c8c2e352da5f5bca5ab29d378895fa2d980630656178bd607c46"}, - {file = "mypy-1.6.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bb8ccb4724f7d8601938571bf3f24da0da791fe2db7be3d9e79849cb64e0ae85"}, - {file = "mypy-1.6.1-cp38-cp38-win_amd64.whl", hash = "sha256:68351911e85145f582b5aa6cd9ad666c8958bcae897a1bfda8f4940472463c45"}, - {file = "mypy-1.6.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:49ae115da099dcc0922a7a895c1eec82c1518109ea5c162ed50e3b3594c71208"}, - {file = "mypy-1.6.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8b27958f8c76bed8edaa63da0739d76e4e9ad4ed325c814f9b3851425582a3cd"}, - {file = "mypy-1.6.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:925cd6a3b7b55dfba252b7c4561892311c5358c6b5a601847015a1ad4eb7d332"}, - {file = "mypy-1.6.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8f57e6b6927a49550da3d122f0cb983d400f843a8a82e65b3b380d3d7259468f"}, - {file = "mypy-1.6.1-cp39-cp39-win_amd64.whl", hash = 
"sha256:a43ef1c8ddfdb9575691720b6352761f3f53d85f1b57d7745701041053deff30"}, - {file = "mypy-1.6.1-py3-none-any.whl", hash = "sha256:4cbe68ef919c28ea561165206a2dcb68591c50f3bcf777932323bc208d949cf1"}, - {file = "mypy-1.6.1.tar.gz", hash = "sha256:4d01c00d09a0be62a4ca3f933e315455bde83f37f892ba4b08ce92f3cf44bcc1"}, + {file = "mypy-1.7.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:5da84d7bf257fd8f66b4f759a904fd2c5a765f70d8b52dde62b521972a0a2357"}, + {file = "mypy-1.7.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a3637c03f4025f6405737570d6cbfa4f1400eb3c649317634d273687a09ffc2f"}, + {file = "mypy-1.7.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b633f188fc5ae1b6edca39dae566974d7ef4e9aaaae00bc36efe1f855e5173ac"}, + {file = "mypy-1.7.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d6ed9a3997b90c6f891138e3f83fb8f475c74db4ccaa942a1c7bf99e83a989a1"}, + {file = "mypy-1.7.0-cp310-cp310-win_amd64.whl", hash = "sha256:1fe46e96ae319df21359c8db77e1aecac8e5949da4773c0274c0ef3d8d1268a9"}, + {file = "mypy-1.7.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:df67fbeb666ee8828f675fee724cc2cbd2e4828cc3df56703e02fe6a421b7401"}, + {file = "mypy-1.7.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a79cdc12a02eb526d808a32a934c6fe6df07b05f3573d210e41808020aed8b5d"}, + {file = "mypy-1.7.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f65f385a6f43211effe8c682e8ec3f55d79391f70a201575def73d08db68ead1"}, + {file = "mypy-1.7.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:0e81ffd120ee24959b449b647c4b2fbfcf8acf3465e082b8d58fd6c4c2b27e46"}, + {file = "mypy-1.7.0-cp311-cp311-win_amd64.whl", hash = "sha256:f29386804c3577c83d76520abf18cfcd7d68264c7e431c5907d250ab502658ee"}, + {file = "mypy-1.7.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:87c076c174e2c7ef8ab416c4e252d94c08cd4980a10967754f91571070bf5fbe"}, + {file = "mypy-1.7.0-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:6cb8d5f6d0fcd9e708bb190b224089e45902cacef6f6915481806b0c77f7786d"}, + {file = "mypy-1.7.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d93e76c2256aa50d9c82a88e2f569232e9862c9982095f6d54e13509f01222fc"}, + {file = "mypy-1.7.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:cddee95dea7990e2215576fae95f6b78a8c12f4c089d7e4367564704e99118d3"}, + {file = "mypy-1.7.0-cp312-cp312-win_amd64.whl", hash = "sha256:d01921dbd691c4061a3e2ecdbfbfad029410c5c2b1ee88946bf45c62c6c91210"}, + {file = "mypy-1.7.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:185cff9b9a7fec1f9f7d8352dff8a4c713b2e3eea9c6c4b5ff7f0edf46b91e41"}, + {file = "mypy-1.7.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7a7b1e399c47b18feb6f8ad4a3eef3813e28c1e871ea7d4ea5d444b2ac03c418"}, + {file = "mypy-1.7.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc9fe455ad58a20ec68599139ed1113b21f977b536a91b42bef3ffed5cce7391"}, + {file = "mypy-1.7.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d0fa29919d2e720c8dbaf07d5578f93d7b313c3e9954c8ec05b6d83da592e5d9"}, + {file = "mypy-1.7.0-cp38-cp38-win_amd64.whl", hash = "sha256:2b53655a295c1ed1af9e96b462a736bf083adba7b314ae775563e3fb4e6795f5"}, + {file = "mypy-1.7.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c1b06b4b109e342f7dccc9efda965fc3970a604db70f8560ddfdee7ef19afb05"}, + {file = "mypy-1.7.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:bf7a2f0a6907f231d5e41adba1a82d7d88cf1f61a70335889412dec99feeb0f8"}, + {file = "mypy-1.7.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:551d4a0cdcbd1d2cccdcc7cb516bb4ae888794929f5b040bb51aae1846062901"}, + {file = "mypy-1.7.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:55d28d7963bef00c330cb6461db80b0b72afe2f3c4e2963c99517cf06454e665"}, + {file = "mypy-1.7.0-cp39-cp39-win_amd64.whl", hash = "sha256:870bd1ffc8a5862e593185a4c169804f2744112b4a7c55b93eb50f48e7a77010"}, + {file = "mypy-1.7.0-py3-none-any.whl", hash = 
"sha256:96650d9a4c651bc2a4991cf46f100973f656d69edc7faf91844e87fe627f7e96"}, + {file = "mypy-1.7.0.tar.gz", hash = "sha256:1e280b5697202efa698372d2f39e9a6713a0395a756b1c6bd48995f8d72690dc"}, ] [package.dependencies] @@ -2395,6 +2416,7 @@ typing-extensions = ">=4.1.0" [package.extras] dmypy = ["psutil (>=4.0)"] install-types = ["pip"] +mypyc = ["setuptools (>=50)"] reports = ["lxml"] [[package]] @@ -2738,13 +2760,13 @@ files = [ [[package]] name = "platformdirs" -version = "3.11.0" +version = "4.0.0" description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." optional = false python-versions = ">=3.7" files = [ - {file = "platformdirs-3.11.0-py3-none-any.whl", hash = "sha256:e9d171d00af68be50e9202731309c4e658fd8bc76f55c11c7dd760d023bda68e"}, - {file = "platformdirs-3.11.0.tar.gz", hash = "sha256:cf8ee52a3afdb965072dcc652433e0c7e3e40cf5ea1477cd4b3b1d2eb75495b3"}, + {file = "platformdirs-4.0.0-py3-none-any.whl", hash = "sha256:118c954d7e949b35437270383a3f2531e99dd93cf7ce4dc8340d3356d30f173b"}, + {file = "platformdirs-4.0.0.tar.gz", hash = "sha256:cb633b2bcf10c51af60beb0ab06d2f1d69064b43abf4c185ca6b28865f3f9731"}, ] [package.extras] @@ -2779,13 +2801,13 @@ files = [ [[package]] name = "pre-commit" -version = "2.21.0" +version = "2.20.0" description = "A framework for managing and maintaining multi-language pre-commit hooks." 
optional = false python-versions = ">=3.7" files = [ - {file = "pre_commit-2.21.0-py2.py3-none-any.whl", hash = "sha256:e2f91727039fc39a92f58a588a25b87f936de6567eed4f0e673e0507edc75bad"}, - {file = "pre_commit-2.21.0.tar.gz", hash = "sha256:31ef31af7e474a8d8995027fefdfcf509b5c913ff31f2015b4ec4beb26a6f658"}, + {file = "pre_commit-2.20.0-py2.py3-none-any.whl", hash = "sha256:51a5ba7c480ae8072ecdb6933df22d2f812dc897d5fe848778116129a681aac7"}, + {file = "pre_commit-2.20.0.tar.gz", hash = "sha256:a978dac7bc9ec0bcee55c18a277d553b0f419d259dadb4b9418ff2d00eb43959"}, ] [package.dependencies] @@ -2793,7 +2815,8 @@ cfgv = ">=2.0.0" identify = ">=1.0.0" nodeenv = ">=0.11.1" pyyaml = ">=5.1" -virtualenv = ">=20.10.0" +toml = "*" +virtualenv = ">=20.0.8" [[package]] name = "prometheus-client" @@ -2811,13 +2834,13 @@ twisted = ["twisted"] [[package]] name = "prompt-toolkit" -version = "3.0.39" +version = "3.0.41" description = "Library for building powerful interactive command lines in Python" optional = false python-versions = ">=3.7.0" files = [ - {file = "prompt_toolkit-3.0.39-py3-none-any.whl", hash = "sha256:9dffbe1d8acf91e3de75f3b544e4842382fc06c6babe903ac9acb74dc6e08d88"}, - {file = "prompt_toolkit-3.0.39.tar.gz", hash = "sha256:04505ade687dc26dc4284b1ad19a83be2f2afe83e7a828ace0c72f3a1df72aac"}, + {file = "prompt_toolkit-3.0.41-py3-none-any.whl", hash = "sha256:f36fe301fafb7470e86aaf90f036eef600a3210be4decf461a5b1ca8403d3cb2"}, + {file = "prompt_toolkit-3.0.41.tar.gz", hash = "sha256:941367d97fc815548822aa26c2a269fdc4eb21e9ec05fc5d447cf09bad5d75f0"}, ] [package.dependencies] @@ -2825,22 +2848,22 @@ wcwidth = "*" [[package]] name = "protobuf" -version = "4.25.0" +version = "4.25.1" description = "" optional = false python-versions = ">=3.8" files = [ - {file = "protobuf-4.25.0-cp310-abi3-win32.whl", hash = "sha256:5c1203ac9f50e4853b0a0bfffd32c67118ef552a33942982eeab543f5c634395"}, - {file = "protobuf-4.25.0-cp310-abi3-win_amd64.whl", hash = 
"sha256:c40ff8f00aa737938c5378d461637d15c442a12275a81019cc2fef06d81c9419"}, - {file = "protobuf-4.25.0-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:cf21faba64cd2c9a3ed92b7a67f226296b10159dbb8fbc5e854fc90657d908e4"}, - {file = "protobuf-4.25.0-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:32ac2100b0e23412413d948c03060184d34a7c50b3e5d7524ee96ac2b10acf51"}, - {file = "protobuf-4.25.0-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:683dc44c61f2620b32ce4927de2108f3ebe8ccf2fd716e1e684e5a50da154054"}, - {file = "protobuf-4.25.0-cp38-cp38-win32.whl", hash = "sha256:1a3ba712877e6d37013cdc3476040ea1e313a6c2e1580836a94f76b3c176d575"}, - {file = "protobuf-4.25.0-cp38-cp38-win_amd64.whl", hash = "sha256:b2cf8b5d381f9378afe84618288b239e75665fe58d0f3fd5db400959274296e9"}, - {file = "protobuf-4.25.0-cp39-cp39-win32.whl", hash = "sha256:63714e79b761a37048c9701a37438aa29945cd2417a97076048232c1df07b701"}, - {file = "protobuf-4.25.0-cp39-cp39-win_amd64.whl", hash = "sha256:d94a33db8b7ddbd0af7c467475fb9fde0c705fb315a8433c0e2020942b863a1f"}, - {file = "protobuf-4.25.0-py3-none-any.whl", hash = "sha256:1a53d6f64b00eecf53b65ff4a8c23dc95df1fa1e97bb06b8122e5a64f49fc90a"}, - {file = "protobuf-4.25.0.tar.gz", hash = "sha256:68f7caf0d4f012fd194a301420cf6aa258366144d814f358c5b32558228afa7c"}, + {file = "protobuf-4.25.1-cp310-abi3-win32.whl", hash = "sha256:193f50a6ab78a970c9b4f148e7c750cfde64f59815e86f686c22e26b4fe01ce7"}, + {file = "protobuf-4.25.1-cp310-abi3-win_amd64.whl", hash = "sha256:3497c1af9f2526962f09329fd61a36566305e6c72da2590ae0d7d1322818843b"}, + {file = "protobuf-4.25.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:0bf384e75b92c42830c0a679b0cd4d6e2b36ae0cf3dbb1e1dfdda48a244f4bcd"}, + {file = "protobuf-4.25.1-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:0f881b589ff449bf0b931a711926e9ddaad3b35089cc039ce1af50b21a4ae8cb"}, + {file = "protobuf-4.25.1-cp37-abi3-manylinux2014_x86_64.whl", hash = 
"sha256:ca37bf6a6d0046272c152eea90d2e4ef34593aaa32e8873fc14c16440f22d4b7"}, + {file = "protobuf-4.25.1-cp38-cp38-win32.whl", hash = "sha256:abc0525ae2689a8000837729eef7883b9391cd6aa7950249dcf5a4ede230d5dd"}, + {file = "protobuf-4.25.1-cp38-cp38-win_amd64.whl", hash = "sha256:1484f9e692091450e7edf418c939e15bfc8fc68856e36ce399aed6889dae8bb0"}, + {file = "protobuf-4.25.1-cp39-cp39-win32.whl", hash = "sha256:8bdbeaddaac52d15c6dce38c71b03038ef7772b977847eb6d374fc86636fa510"}, + {file = "protobuf-4.25.1-cp39-cp39-win_amd64.whl", hash = "sha256:becc576b7e6b553d22cbdf418686ee4daa443d7217999125c045ad56322dda10"}, + {file = "protobuf-4.25.1-py3-none-any.whl", hash = "sha256:a19731d5e83ae4737bb2a089605e636077ac001d18781b3cf489b9546c7c80d6"}, + {file = "protobuf-4.25.1.tar.gz", hash = "sha256:57d65074b4f5baa4ab5da1605c02be90ac20c8b40fb137d6a8df9f416b0d0ce2"}, ] [[package]] @@ -3144,33 +3167,15 @@ pytest = ">=4.6" [package.extras] testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtualenv"] -[[package]] -name = "pytest-httpx" -version = "0.22.0" -description = "Send responses to httpx." 
-optional = false -python-versions = ">=3.7" -files = [ - {file = "pytest_httpx-0.22.0-py3-none-any.whl", hash = "sha256:cefb7dcf66a4cb0601b0de05e576cca423b6081f3245e7912a4d84c58fa3eae8"}, - {file = "pytest_httpx-0.22.0.tar.gz", hash = "sha256:3a82797f3a9a14d51e8c6b7fa97524b68b847ee801109c062e696b4744f4431c"}, -] - -[package.dependencies] -httpx = "==0.24.*" -pytest = ">=6.0,<8.0" - -[package.extras] -testing = ["pytest-asyncio (==0.20.*)", "pytest-cov (==4.*)"] - [[package]] name = "pytest-xdist" -version = "3.3.1" +version = "3.4.0" description = "pytest xdist plugin for distributed testing, most importantly across multiple CPUs" optional = false python-versions = ">=3.7" files = [ - {file = "pytest-xdist-3.3.1.tar.gz", hash = "sha256:d5ee0520eb1b7bcca50a60a518ab7a7707992812c578198f8b44fdfac78e8c93"}, - {file = "pytest_xdist-3.3.1-py3-none-any.whl", hash = "sha256:ff9daa7793569e6a68544850fd3927cd257cc03a7ef76c95e86915355e82b5f2"}, + {file = "pytest-xdist-3.4.0.tar.gz", hash = "sha256:3a94a931dd9e268e0b871a877d09fe2efb6175c2c23d60d56a6001359002b832"}, + {file = "pytest_xdist-3.4.0-py3-none-any.whl", hash = "sha256:e513118bf787677a427e025606f55e95937565e06dfaac8d87f55301e57ae607"}, ] [package.dependencies] @@ -3281,6 +3286,7 @@ files = [ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"}, {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"}, + {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"}, {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = 
"sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"}, {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"}, {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"}, @@ -3288,8 +3294,15 @@ files = [ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"}, {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"}, + {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"}, {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"}, {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"}, + {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"}, + {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"}, + {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"}, + {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"}, + {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = 
"sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"}, + {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"}, {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"}, {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"}, @@ -3306,6 +3319,7 @@ files = [ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"}, {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"}, + {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"}, {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"}, {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"}, {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"}, @@ -3313,6 +3327,7 @@ files = [ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"}, {file = 
"PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"}, {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"}, + {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"}, {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"}, {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"}, {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"}, @@ -3477,15 +3492,32 @@ files = [ [package.dependencies] requests = ">=2.0.1,<3.0.0" +[[package]] +name = "rfc3986" +version = "1.5.0" +description = "Validating URI References per RFC 3986" +optional = false +python-versions = "*" +files = [ + {file = "rfc3986-1.5.0-py2.py3-none-any.whl", hash = "sha256:a86d6e1f5b1dc238b218b012df0aa79409667bb209e58da56d0b94704e712a97"}, + {file = "rfc3986-1.5.0.tar.gz", hash = "sha256:270aaf10d87d0d4e095063c65bf3ddbc6ee3d0b226328ce21e036f946e421835"}, +] + +[package.dependencies] +idna = {version = "*", optional = true, markers = "extra == \"idna2008\""} + +[package.extras] +idna2008 = ["idna"] + [[package]] name = "rich" -version = "13.6.0" +version = "13.7.0" description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal" optional = false python-versions = ">=3.7.0" files = [ - {file = "rich-13.6.0-py3-none-any.whl", hash = "sha256:2b38e2fe9ca72c9a00170a1a2d20c63c790d0e10ef1fe35eba76e1e7b1d7d245"}, - {file = "rich-13.6.0.tar.gz", hash = "sha256:5c14d22737e6d5084ef4771b62d5d4363165b403455a30a1c8ca39dc7b644bef"}, + {file = "rich-13.7.0-py3-none-any.whl", hash = 
"sha256:6da14c108c4866ee9520bbffa71f6fe3962e193b7da68720583850cd4548e235"}, + {file = "rich-13.7.0.tar.gz", hash = "sha256:5cb5123b5cf9ee70584244246816e9114227e0b98ad9176eede6ad54bf5403fa"}, ] [package.dependencies] @@ -3786,13 +3818,13 @@ files = [ [[package]] name = "tomlkit" -version = "0.12.2" +version = "0.12.3" description = "Style preserving TOML library" optional = false python-versions = ">=3.7" files = [ - {file = "tomlkit-0.12.2-py3-none-any.whl", hash = "sha256:eeea7ac7563faeab0a1ed8fe12c2e5a51c61f933f2502f7e9db0241a65163ad0"}, - {file = "tomlkit-0.12.2.tar.gz", hash = "sha256:df32fab589a81f0d7dc525a4267b6d7a64ee99619cbd1eeb0fae32c1dd426977"}, + {file = "tomlkit-0.12.3-py3-none-any.whl", hash = "sha256:b0a645a9156dc7cb5d3a1f0d4bab66db287fcb8e0430bdd4664a095ea16414ba"}, + {file = "tomlkit-0.12.3.tar.gz", hash = "sha256:75baf5012d06501f07bee5bf8e801b9f343e7aac5a92581f20f80ce632e6b5a4"}, ] [[package]] @@ -4107,23 +4139,24 @@ test = ["Cython (>=0.29.36,<0.30.0)", "aiohttp (==3.9.0b0)", "aiohttp (>=3.8.1)" [[package]] name = "virtualenv" -version = "20.24.6" +version = "20.4.7" description = "Virtual Python Environment builder" optional = false -python-versions = ">=3.7" +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7" files = [ - {file = "virtualenv-20.24.6-py3-none-any.whl", hash = "sha256:520d056652454c5098a00c0f073611ccbea4c79089331f60bf9d7ba247bb7381"}, - {file = "virtualenv-20.24.6.tar.gz", hash = "sha256:02ece4f56fbf939dbbc33c0715159951d6bf14aaf5457b092e4548e1382455af"}, + {file = "virtualenv-20.4.7-py2.py3-none-any.whl", hash = "sha256:2b0126166ea7c9c3661f5b8e06773d28f83322de7a3ff7d06f0aed18c9de6a76"}, + {file = "virtualenv-20.4.7.tar.gz", hash = "sha256:14fdf849f80dbb29a4eb6caa9875d476ee2a5cf76a5f5415fa2f1606010ab467"}, ] [package.dependencies] -distlib = ">=0.3.7,<1" -filelock = ">=3.12.2,<4" -platformdirs = ">=3.9.1,<4" +appdirs = ">=1.4.3,<2" +distlib = ">=0.3.1,<1" +filelock = ">=3.0.0,<4" +six = ">=1.9.0,<2" [package.extras] -docs = 
["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"] -test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8)", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10)"] +docs = ["proselint (>=0.10.2)", "sphinx (>=3)", "sphinx-argparse (>=0.2.5)", "sphinx-rtd-theme (>=0.4.3)", "towncrier (>=19.9.0rc1)"] +testing = ["coverage (>=4)", "coverage-enable-subprocess (>=1)", "flaky (>=3)", "packaging (>=20.0)", "pytest (>=4)", "pytest-env (>=0.6.2)", "pytest-freezegun (>=0.4.1)", "pytest-mock (>=2)", "pytest-randomly (>=1)", "pytest-timeout (>=1)", "xonsh (>=0.9.16)"] [[package]] name = "watchdog" @@ -4253,13 +4286,13 @@ anyio = ">=3.0.0" [[package]] name = "wcwidth" -version = "0.2.9" +version = "0.2.10" description = "Measures the displayed width of unicode strings in a terminal" optional = false python-versions = "*" files = [ - {file = "wcwidth-0.2.9-py2.py3-none-any.whl", hash = "sha256:9a929bd8380f6cd9571a968a9c8f4353ca58d7cd812a4822bba831f8d685b223"}, - {file = "wcwidth-0.2.9.tar.gz", hash = "sha256:a675d1a4a2d24ef67096a04b85b02deeecd8e226f57b5e3a72dbb9ed99d27da8"}, + {file = "wcwidth-0.2.10-py2.py3-none-any.whl", hash = "sha256:aec5179002dd0f0d40c456026e74a729661c9d468e1ed64405e3a6c2176ca36f"}, + {file = "wcwidth-0.2.10.tar.gz", hash = "sha256:390c7454101092a6a5e43baad8f83de615463af459201709556b6e4b1c861f97"}, ] [[package]] @@ -4362,97 +4395,92 @@ watchdog = ["watchdog (>=2.3)"] [[package]] name = "wrapt" -version = "1.15.0" +version = "1.16.0" description = "Module for decorators, wrappers and monkey patching." 
optional = false -python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7" -files = [ - {file = "wrapt-1.15.0-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:ca1cccf838cd28d5a0883b342474c630ac48cac5df0ee6eacc9c7290f76b11c1"}, - {file = "wrapt-1.15.0-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:e826aadda3cae59295b95343db8f3d965fb31059da7de01ee8d1c40a60398b29"}, - {file = "wrapt-1.15.0-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:5fc8e02f5984a55d2c653f5fea93531e9836abbd84342c1d1e17abc4a15084c2"}, - {file = "wrapt-1.15.0-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:96e25c8603a155559231c19c0349245eeb4ac0096fe3c1d0be5c47e075bd4f46"}, - {file = "wrapt-1.15.0-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:40737a081d7497efea35ab9304b829b857f21558acfc7b3272f908d33b0d9d4c"}, - {file = "wrapt-1.15.0-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:f87ec75864c37c4c6cb908d282e1969e79763e0d9becdfe9fe5473b7bb1e5f09"}, - {file = "wrapt-1.15.0-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:1286eb30261894e4c70d124d44b7fd07825340869945c79d05bda53a40caa079"}, - {file = "wrapt-1.15.0-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:493d389a2b63c88ad56cdc35d0fa5752daac56ca755805b1b0c530f785767d5e"}, - {file = "wrapt-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:58d7a75d731e8c63614222bcb21dd992b4ab01a399f1f09dd82af17bbfc2368a"}, - {file = "wrapt-1.15.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:21f6d9a0d5b3a207cdf7acf8e58d7d13d463e639f0c7e01d82cdb671e6cb7923"}, - {file = "wrapt-1.15.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ce42618f67741d4697684e501ef02f29e758a123aa2d669e2d964ff734ee00ee"}, - {file = "wrapt-1.15.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41d07d029dd4157ae27beab04d22b8e261eddfc6ecd64ff7000b10dc8b3a5727"}, - {file = "wrapt-1.15.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:54accd4b8bc202966bafafd16e69da9d5640ff92389d33d28555c5fd4f25ccb7"}, - {file = "wrapt-1.15.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2fbfbca668dd15b744418265a9607baa970c347eefd0db6a518aaf0cfbd153c0"}, - {file = "wrapt-1.15.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:76e9c727a874b4856d11a32fb0b389afc61ce8aaf281ada613713ddeadd1cfec"}, - {file = "wrapt-1.15.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e20076a211cd6f9b44a6be58f7eeafa7ab5720eb796975d0c03f05b47d89eb90"}, - {file = "wrapt-1.15.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a74d56552ddbde46c246b5b89199cb3fd182f9c346c784e1a93e4dc3f5ec9975"}, - {file = "wrapt-1.15.0-cp310-cp310-win32.whl", hash = "sha256:26458da5653aa5b3d8dc8b24192f574a58984c749401f98fff994d41d3f08da1"}, - {file = "wrapt-1.15.0-cp310-cp310-win_amd64.whl", hash = "sha256:75760a47c06b5974aa5e01949bf7e66d2af4d08cb8c1d6516af5e39595397f5e"}, - {file = "wrapt-1.15.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ba1711cda2d30634a7e452fc79eabcadaffedf241ff206db2ee93dd2c89a60e7"}, - {file = "wrapt-1.15.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:56374914b132c702aa9aa9959c550004b8847148f95e1b824772d453ac204a72"}, - {file = "wrapt-1.15.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a89ce3fd220ff144bd9d54da333ec0de0399b52c9ac3d2ce34b569cf1a5748fb"}, - {file = "wrapt-1.15.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3bbe623731d03b186b3d6b0d6f51865bf598587c38d6f7b0be2e27414f7f214e"}, - {file = "wrapt-1.15.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3abbe948c3cbde2689370a262a8d04e32ec2dd4f27103669a45c6929bcdbfe7c"}, - {file = "wrapt-1.15.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:b67b819628e3b748fd3c2192c15fb951f549d0f47c0449af0764d7647302fda3"}, - 
{file = "wrapt-1.15.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:7eebcdbe3677e58dd4c0e03b4f2cfa346ed4049687d839adad68cc38bb559c92"}, - {file = "wrapt-1.15.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:74934ebd71950e3db69960a7da29204f89624dde411afbfb3b4858c1409b1e98"}, - {file = "wrapt-1.15.0-cp311-cp311-win32.whl", hash = "sha256:bd84395aab8e4d36263cd1b9308cd504f6cf713b7d6d3ce25ea55670baec5416"}, - {file = "wrapt-1.15.0-cp311-cp311-win_amd64.whl", hash = "sha256:a487f72a25904e2b4bbc0817ce7a8de94363bd7e79890510174da9d901c38705"}, - {file = "wrapt-1.15.0-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:4ff0d20f2e670800d3ed2b220d40984162089a6e2c9646fdb09b85e6f9a8fc29"}, - {file = "wrapt-1.15.0-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:9ed6aa0726b9b60911f4aed8ec5b8dd7bf3491476015819f56473ffaef8959bd"}, - {file = "wrapt-1.15.0-cp35-cp35m-manylinux2010_i686.whl", hash = "sha256:896689fddba4f23ef7c718279e42f8834041a21342d95e56922e1c10c0cc7afb"}, - {file = "wrapt-1.15.0-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:75669d77bb2c071333417617a235324a1618dba66f82a750362eccbe5b61d248"}, - {file = "wrapt-1.15.0-cp35-cp35m-win32.whl", hash = "sha256:fbec11614dba0424ca72f4e8ba3c420dba07b4a7c206c8c8e4e73f2e98f4c559"}, - {file = "wrapt-1.15.0-cp35-cp35m-win_amd64.whl", hash = "sha256:fd69666217b62fa5d7c6aa88e507493a34dec4fa20c5bd925e4bc12fce586639"}, - {file = "wrapt-1.15.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:b0724f05c396b0a4c36a3226c31648385deb6a65d8992644c12a4963c70326ba"}, - {file = "wrapt-1.15.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bbeccb1aa40ab88cd29e6c7d8585582c99548f55f9b2581dfc5ba68c59a85752"}, - {file = "wrapt-1.15.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:38adf7198f8f154502883242f9fe7333ab05a5b02de7d83aa2d88ea621f13364"}, - {file = 
"wrapt-1.15.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:578383d740457fa790fdf85e6d346fda1416a40549fe8db08e5e9bd281c6a475"}, - {file = "wrapt-1.15.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:a4cbb9ff5795cd66f0066bdf5947f170f5d63a9274f99bdbca02fd973adcf2a8"}, - {file = "wrapt-1.15.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:af5bd9ccb188f6a5fdda9f1f09d9f4c86cc8a539bd48a0bfdc97723970348418"}, - {file = "wrapt-1.15.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:b56d5519e470d3f2fe4aa7585f0632b060d532d0696c5bdfb5e8319e1d0f69a2"}, - {file = "wrapt-1.15.0-cp36-cp36m-win32.whl", hash = "sha256:77d4c1b881076c3ba173484dfa53d3582c1c8ff1f914c6461ab70c8428b796c1"}, - {file = "wrapt-1.15.0-cp36-cp36m-win_amd64.whl", hash = "sha256:077ff0d1f9d9e4ce6476c1a924a3332452c1406e59d90a2cf24aeb29eeac9420"}, - {file = "wrapt-1.15.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5c5aa28df055697d7c37d2099a7bc09f559d5053c3349b1ad0c39000e611d317"}, - {file = "wrapt-1.15.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3a8564f283394634a7a7054b7983e47dbf39c07712d7b177b37e03f2467a024e"}, - {file = "wrapt-1.15.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:780c82a41dc493b62fc5884fb1d3a3b81106642c5c5c78d6a0d4cbe96d62ba7e"}, - {file = "wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e169e957c33576f47e21864cf3fc9ff47c223a4ebca8960079b8bd36cb014fd0"}, - {file = "wrapt-1.15.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b02f21c1e2074943312d03d243ac4388319f2456576b2c6023041c4d57cd7019"}, - {file = "wrapt-1.15.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:f2e69b3ed24544b0d3dbe2c5c0ba5153ce50dcebb576fdc4696d52aa22db6034"}, - {file = "wrapt-1.15.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = 
"sha256:d787272ed958a05b2c86311d3a4135d3c2aeea4fc655705f074130aa57d71653"}, - {file = "wrapt-1.15.0-cp37-cp37m-win32.whl", hash = "sha256:02fce1852f755f44f95af51f69d22e45080102e9d00258053b79367d07af39c0"}, - {file = "wrapt-1.15.0-cp37-cp37m-win_amd64.whl", hash = "sha256:abd52a09d03adf9c763d706df707c343293d5d106aea53483e0ec8d9e310ad5e"}, - {file = "wrapt-1.15.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cdb4f085756c96a3af04e6eca7f08b1345e94b53af8921b25c72f096e704e145"}, - {file = "wrapt-1.15.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:230ae493696a371f1dbffaad3dafbb742a4d27a0afd2b1aecebe52b740167e7f"}, - {file = "wrapt-1.15.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63424c681923b9f3bfbc5e3205aafe790904053d42ddcc08542181a30a7a51bd"}, - {file = "wrapt-1.15.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d6bcbfc99f55655c3d93feb7ef3800bd5bbe963a755687cbf1f490a71fb7794b"}, - {file = "wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c99f4309f5145b93eca6e35ac1a988f0dc0a7ccf9ccdcd78d3c0adf57224e62f"}, - {file = "wrapt-1.15.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b130fe77361d6771ecf5a219d8e0817d61b236b7d8b37cc045172e574ed219e6"}, - {file = "wrapt-1.15.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:96177eb5645b1c6985f5c11d03fc2dbda9ad24ec0f3a46dcce91445747e15094"}, - {file = "wrapt-1.15.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d5fe3e099cf07d0fb5a1e23d399e5d4d1ca3e6dfcbe5c8570ccff3e9208274f7"}, - {file = "wrapt-1.15.0-cp38-cp38-win32.whl", hash = "sha256:abd8f36c99512755b8456047b7be10372fca271bf1467a1caa88db991e7c421b"}, - {file = "wrapt-1.15.0-cp38-cp38-win_amd64.whl", hash = "sha256:b06fa97478a5f478fb05e1980980a7cdf2712015493b44d0c87606c1513ed5b1"}, - {file = "wrapt-1.15.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = 
"sha256:2e51de54d4fb8fb50d6ee8327f9828306a959ae394d3e01a1ba8b2f937747d86"}, - {file = "wrapt-1.15.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0970ddb69bba00670e58955f8019bec4a42d1785db3faa043c33d81de2bf843c"}, - {file = "wrapt-1.15.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76407ab327158c510f44ded207e2f76b657303e17cb7a572ffe2f5a8a48aa04d"}, - {file = "wrapt-1.15.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cd525e0e52a5ff16653a3fc9e3dd827981917d34996600bbc34c05d048ca35cc"}, - {file = "wrapt-1.15.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d37ac69edc5614b90516807de32d08cb8e7b12260a285ee330955604ed9dd29"}, - {file = "wrapt-1.15.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:078e2a1a86544e644a68422f881c48b84fef6d18f8c7a957ffd3f2e0a74a0d4a"}, - {file = "wrapt-1.15.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:2cf56d0e237280baed46f0b5316661da892565ff58309d4d2ed7dba763d984b8"}, - {file = "wrapt-1.15.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7dc0713bf81287a00516ef43137273b23ee414fe41a3c14be10dd95ed98a2df9"}, - {file = "wrapt-1.15.0-cp39-cp39-win32.whl", hash = "sha256:46ed616d5fb42f98630ed70c3529541408166c22cdfd4540b88d5f21006b0eff"}, - {file = "wrapt-1.15.0-cp39-cp39-win_amd64.whl", hash = "sha256:eef4d64c650f33347c1f9266fa5ae001440b232ad9b98f1f43dfe7a79435c0a6"}, - {file = "wrapt-1.15.0-py3-none-any.whl", hash = "sha256:64b1df0f83706b4ef4cfb4fb0e4c2669100fd7ecacfb59e091fad300d4e04640"}, - {file = "wrapt-1.15.0.tar.gz", hash = "sha256:d06730c6aed78cee4126234cf2d071e01b44b915e725a6cb439a879ec9754a3a"}, +python-versions = ">=3.6" +files = [ + {file = "wrapt-1.16.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ffa565331890b90056c01db69c0fe634a776f8019c143a5ae265f9c6bc4bd6d4"}, + {file = "wrapt-1.16.0-cp310-cp310-macosx_11_0_arm64.whl", hash = 
"sha256:e4fdb9275308292e880dcbeb12546df7f3e0f96c6b41197e0cf37d2826359020"}, + {file = "wrapt-1.16.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb2dee3874a500de01c93d5c71415fcaef1d858370d405824783e7a8ef5db440"}, + {file = "wrapt-1.16.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2a88e6010048489cda82b1326889ec075a8c856c2e6a256072b28eaee3ccf487"}, + {file = "wrapt-1.16.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac83a914ebaf589b69f7d0a1277602ff494e21f4c2f743313414378f8f50a4cf"}, + {file = "wrapt-1.16.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:73aa7d98215d39b8455f103de64391cb79dfcad601701a3aa0dddacf74911d72"}, + {file = "wrapt-1.16.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:807cc8543a477ab7422f1120a217054f958a66ef7314f76dd9e77d3f02cdccd0"}, + {file = "wrapt-1.16.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:bf5703fdeb350e36885f2875d853ce13172ae281c56e509f4e6eca049bdfb136"}, + {file = "wrapt-1.16.0-cp310-cp310-win32.whl", hash = "sha256:f6b2d0c6703c988d334f297aa5df18c45e97b0af3679bb75059e0e0bd8b1069d"}, + {file = "wrapt-1.16.0-cp310-cp310-win_amd64.whl", hash = "sha256:decbfa2f618fa8ed81c95ee18a387ff973143c656ef800c9f24fb7e9c16054e2"}, + {file = "wrapt-1.16.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1a5db485fe2de4403f13fafdc231b0dbae5eca4359232d2efc79025527375b09"}, + {file = "wrapt-1.16.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:75ea7d0ee2a15733684badb16de6794894ed9c55aa5e9903260922f0482e687d"}, + {file = "wrapt-1.16.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a452f9ca3e3267cd4d0fcf2edd0d035b1934ac2bd7e0e57ac91ad6b95c0c6389"}, + {file = "wrapt-1.16.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:43aa59eadec7890d9958748db829df269f0368521ba6dc68cc172d5d03ed8060"}, + {file = "wrapt-1.16.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:72554a23c78a8e7aa02abbd699d129eead8b147a23c56e08d08dfc29cfdddca1"}, + {file = "wrapt-1.16.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d2efee35b4b0a347e0d99d28e884dfd82797852d62fcd7ebdeee26f3ceb72cf3"}, + {file = "wrapt-1.16.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:6dcfcffe73710be01d90cae08c3e548d90932d37b39ef83969ae135d36ef3956"}, + {file = "wrapt-1.16.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:eb6e651000a19c96f452c85132811d25e9264d836951022d6e81df2fff38337d"}, + {file = "wrapt-1.16.0-cp311-cp311-win32.whl", hash = "sha256:66027d667efe95cc4fa945af59f92c5a02c6f5bb6012bff9e60542c74c75c362"}, + {file = "wrapt-1.16.0-cp311-cp311-win_amd64.whl", hash = "sha256:aefbc4cb0a54f91af643660a0a150ce2c090d3652cf4052a5397fb2de549cd89"}, + {file = "wrapt-1.16.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:5eb404d89131ec9b4f748fa5cfb5346802e5ee8836f57d516576e61f304f3b7b"}, + {file = "wrapt-1.16.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9090c9e676d5236a6948330e83cb89969f433b1943a558968f659ead07cb3b36"}, + {file = "wrapt-1.16.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94265b00870aa407bd0cbcfd536f17ecde43b94fb8d228560a1e9d3041462d73"}, + {file = "wrapt-1.16.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2058f813d4f2b5e3a9eb2eb3faf8f1d99b81c3e51aeda4b168406443e8ba809"}, + {file = "wrapt-1.16.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98b5e1f498a8ca1858a1cdbffb023bfd954da4e3fa2c0cb5853d40014557248b"}, + {file = "wrapt-1.16.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:14d7dc606219cdd7405133c713f2c218d4252f2a469003f8c46bb92d5d095d81"}, + 
{file = "wrapt-1.16.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:49aac49dc4782cb04f58986e81ea0b4768e4ff197b57324dcbd7699c5dfb40b9"}, + {file = "wrapt-1.16.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:418abb18146475c310d7a6dc71143d6f7adec5b004ac9ce08dc7a34e2babdc5c"}, + {file = "wrapt-1.16.0-cp312-cp312-win32.whl", hash = "sha256:685f568fa5e627e93f3b52fda002c7ed2fa1800b50ce51f6ed1d572d8ab3e7fc"}, + {file = "wrapt-1.16.0-cp312-cp312-win_amd64.whl", hash = "sha256:dcdba5c86e368442528f7060039eda390cc4091bfd1dca41e8046af7c910dda8"}, + {file = "wrapt-1.16.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:d462f28826f4657968ae51d2181a074dfe03c200d6131690b7d65d55b0f360f8"}, + {file = "wrapt-1.16.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a33a747400b94b6d6b8a165e4480264a64a78c8a4c734b62136062e9a248dd39"}, + {file = "wrapt-1.16.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b3646eefa23daeba62643a58aac816945cadc0afaf21800a1421eeba5f6cfb9c"}, + {file = "wrapt-1.16.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ebf019be5c09d400cf7b024aa52b1f3aeebeff51550d007e92c3c1c4afc2a40"}, + {file = "wrapt-1.16.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:0d2691979e93d06a95a26257adb7bfd0c93818e89b1406f5a28f36e0d8c1e1fc"}, + {file = "wrapt-1.16.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:1acd723ee2a8826f3d53910255643e33673e1d11db84ce5880675954183ec47e"}, + {file = "wrapt-1.16.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:bc57efac2da352a51cc4658878a68d2b1b67dbe9d33c36cb826ca449d80a8465"}, + {file = "wrapt-1.16.0-cp36-cp36m-win32.whl", hash = "sha256:da4813f751142436b075ed7aa012a8778aa43a99f7b36afe9b742d3ed8bdc95e"}, + {file = "wrapt-1.16.0-cp36-cp36m-win_amd64.whl", hash = "sha256:6f6eac2360f2d543cc875a0e5efd413b6cbd483cb3ad7ebf888884a6e0d2e966"}, + {file = 
"wrapt-1.16.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a0ea261ce52b5952bf669684a251a66df239ec6d441ccb59ec7afa882265d593"}, + {file = "wrapt-1.16.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7bd2d7ff69a2cac767fbf7a2b206add2e9a210e57947dd7ce03e25d03d2de292"}, + {file = "wrapt-1.16.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9159485323798c8dc530a224bd3ffcf76659319ccc7bbd52e01e73bd0241a0c5"}, + {file = "wrapt-1.16.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a86373cf37cd7764f2201b76496aba58a52e76dedfaa698ef9e9688bfd9e41cf"}, + {file = "wrapt-1.16.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:73870c364c11f03ed072dda68ff7aea6d2a3a5c3fe250d917a429c7432e15228"}, + {file = "wrapt-1.16.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:b935ae30c6e7400022b50f8d359c03ed233d45b725cfdd299462f41ee5ffba6f"}, + {file = "wrapt-1.16.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:db98ad84a55eb09b3c32a96c576476777e87c520a34e2519d3e59c44710c002c"}, + {file = "wrapt-1.16.0-cp37-cp37m-win32.whl", hash = "sha256:9153ed35fc5e4fa3b2fe97bddaa7cbec0ed22412b85bcdaf54aeba92ea37428c"}, + {file = "wrapt-1.16.0-cp37-cp37m-win_amd64.whl", hash = "sha256:66dfbaa7cfa3eb707bbfcd46dab2bc6207b005cbc9caa2199bcbc81d95071a00"}, + {file = "wrapt-1.16.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1dd50a2696ff89f57bd8847647a1c363b687d3d796dc30d4dd4a9d1689a706f0"}, + {file = "wrapt-1.16.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:44a2754372e32ab315734c6c73b24351d06e77ffff6ae27d2ecf14cf3d229202"}, + {file = "wrapt-1.16.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e9723528b9f787dc59168369e42ae1c3b0d3fadb2f1a71de14531d321ee05b0"}, + {file = "wrapt-1.16.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:dbed418ba5c3dce92619656802cc5355cb679e58d0d89b50f116e4a9d5a9603e"}, + {file = "wrapt-1.16.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:941988b89b4fd6b41c3f0bfb20e92bd23746579736b7343283297c4c8cbae68f"}, + {file = "wrapt-1.16.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6a42cd0cfa8ffc1915aef79cb4284f6383d8a3e9dcca70c445dcfdd639d51267"}, + {file = "wrapt-1.16.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:1ca9b6085e4f866bd584fb135a041bfc32cab916e69f714a7d1d397f8c4891ca"}, + {file = "wrapt-1.16.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d5e49454f19ef621089e204f862388d29e6e8d8b162efce05208913dde5b9ad6"}, + {file = "wrapt-1.16.0-cp38-cp38-win32.whl", hash = "sha256:c31f72b1b6624c9d863fc095da460802f43a7c6868c5dda140f51da24fd47d7b"}, + {file = "wrapt-1.16.0-cp38-cp38-win_amd64.whl", hash = "sha256:490b0ee15c1a55be9c1bd8609b8cecd60e325f0575fc98f50058eae366e01f41"}, + {file = "wrapt-1.16.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9b201ae332c3637a42f02d1045e1d0cccfdc41f1f2f801dafbaa7e9b4797bfc2"}, + {file = "wrapt-1.16.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2076fad65c6736184e77d7d4729b63a6d1ae0b70da4868adeec40989858eb3fb"}, + {file = "wrapt-1.16.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5cd603b575ebceca7da5a3a251e69561bec509e0b46e4993e1cac402b7247b8"}, + {file = "wrapt-1.16.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b47cfad9e9bbbed2339081f4e346c93ecd7ab504299403320bf85f7f85c7d46c"}, + {file = "wrapt-1.16.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8212564d49c50eb4565e502814f694e240c55551a5f1bc841d4fcaabb0a9b8a"}, + {file = "wrapt-1.16.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5f15814a33e42b04e3de432e573aa557f9f0f56458745c2074952f564c50e664"}, + {file = 
"wrapt-1.16.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:db2e408d983b0e61e238cf579c09ef7020560441906ca990fe8412153e3b291f"}, + {file = "wrapt-1.16.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:edfad1d29c73f9b863ebe7082ae9321374ccb10879eeabc84ba3b69f2579d537"}, + {file = "wrapt-1.16.0-cp39-cp39-win32.whl", hash = "sha256:ed867c42c268f876097248e05b6117a65bcd1e63b779e916fe2e33cd6fd0d3c3"}, + {file = "wrapt-1.16.0-cp39-cp39-win_amd64.whl", hash = "sha256:eb1b046be06b0fce7249f1d025cd359b4b80fc1c3e24ad9eca33e0dcdb2e4a35"}, + {file = "wrapt-1.16.0-py3-none-any.whl", hash = "sha256:6906c4100a8fcbf2fa735f6059214bb13b97f75b1a61777fcf6432121ef12ef1"}, + {file = "wrapt-1.16.0.tar.gz", hash = "sha256:5f370f952971e7d17c7d1ead40e49f32345a7f7a5373571ef44d800d06b1899d"}, ] [[package]] name = "yamllint" -version = "1.32.0" +version = "1.33.0" description = "A linter for YAML files." optional = false -python-versions = ">=3.7" +python-versions = ">=3.8" files = [ - {file = "yamllint-1.32.0-py3-none-any.whl", hash = "sha256:d97a66e48da820829d96077d76b8dfbe6c6140f106e558dae87e81ac4e6b30b7"}, - {file = "yamllint-1.32.0.tar.gz", hash = "sha256:d01dde008c65de5b235188ab3110bebc59d18e5c65fc8a58267cd211cd9df34a"}, + {file = "yamllint-1.33.0-py3-none-any.whl", hash = "sha256:28a19f5d68d28d8fec538a1db21bb2d84c7dc2e2ea36266da8d4d1c5a683814d"}, + {file = "yamllint-1.33.0.tar.gz", hash = "sha256:2dceab9ef2d99518a2fcf4ffc964d44250ac4459be1ba3ca315118e4a1a81f7d"}, ] [package.dependencies] @@ -4638,4 +4666,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"] [metadata] lock-version = "2.0" python-versions = "^3.8, < 3.12" -content-hash = "cb7d601e824197389e1d89421aa4930b690959caacc279fb1d123f6d86be0c7d" +content-hash = "af3ae745d04c78b74b6f6f2b83f5bf9f52fae58deb21988a8ecfc46429940607" diff --git a/pyproject.toml b/pyproject.toml index f6dcfc55b5..f6fa6e8b7b 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -13,6 +13,7 @@ packages = [ [tool.poetry.dependencies] python 
= "^3.8, < 3.12" infrahub-sdk = {path = "python_sdk", develop=true} +pydantic = "^1.10" [tool.poetry.group.server.dependencies] fastapi = "~0.95" @@ -49,7 +50,6 @@ requests = "*" pre-commit = "^2.20.0" autoflake = "*" pytest-clarity = "^1.0.1" -pytest-httpx = "^0.22" types-toml = "*" types-ujson = "*" types-pyyaml = "*" From 8dc7f96055b84d3f1f8e62baf089123fa4943543 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 09:36:10 +0100 Subject: [PATCH 049/446] Add missing import for ctl and fix incorrect description --- tasks/__init__.py | 3 ++- tasks/docs.py | 2 +- 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/tasks/__init__.py b/tasks/__init__.py index b8e8dbac38..e40ba3daad 100644 --- a/tasks/__init__.py +++ b/tasks/__init__.py @@ -2,9 +2,10 @@ from invoke import Collection, Context, task -from . import backend, demo, docs, main, performance, sdk, sync, test +from . import backend, ctl, demo, docs, main, performance, sdk, sync, test ns = Collection() +ns.add_collection(ctl) ns.add_collection(sdk) ns.add_collection(docs) ns.add_collection(performance) diff --git a/tasks/docs.py b/tasks/docs.py index 4da8db7758..57b9264674 100644 --- a/tasks/docs.py +++ b/tasks/docs.py @@ -7,7 +7,7 @@ @task def build(context: Context): - """Run documentation server in development mode.""" + """Build documentation website.""" exec_cmd = "npx retype build docs" with context.cd(ESCAPED_REPO_PATH): output = context.run(exec_cmd) From b3bb41b257be66c5b8bd42c62cf13a861133e440 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 10:10:54 +0100 Subject: [PATCH 050/446] Add InfrahubLogger object to services --- backend/infrahub/services/__init__.py | 4 ++++ backend/infrahub/services/protocols.py | 21 +++++++++++++++++++++ 2 files changed, 25 insertions(+) create mode 100644 backend/infrahub/services/protocols.py diff --git a/backend/infrahub/services/__init__.py b/backend/infrahub/services/__init__.py index 787a942bc3..b27f5c4de4 100644 --- 
a/backend/infrahub/services/__init__.py +++ b/backend/infrahub/services/__init__.py @@ -4,12 +4,14 @@ from infrahub.database import InfrahubDatabase from infrahub.exceptions import InitializationError +from infrahub.log import get_logger from infrahub.message_bus import InfrahubMessage, InfrahubResponse, Meta from infrahub.message_bus.messages import ROUTING_KEY_MAP from infrahub.message_bus.types import MessageTTL from .adapters.cache import InfrahubCache from .adapters.message_bus import InfrahubMessageBus +from .protocols import InfrahubLogger class InfrahubServices: @@ -19,11 +21,13 @@ def __init__( client: Optional[InfrahubClient] = None, database: Optional[InfrahubDatabase] = None, message_bus: Optional[InfrahubMessageBus] = None, + log: Optional[InfrahubLogger] = None, ): self.cache = cache or InfrahubCache() self._client = client self._database = database self.message_bus = message_bus or InfrahubMessageBus() + self.log = log or get_logger() @property def client(self) -> InfrahubClient: diff --git a/backend/infrahub/services/protocols.py b/backend/infrahub/services/protocols.py new file mode 100644 index 0000000000..4bd2a09030 --- /dev/null +++ b/backend/infrahub/services/protocols.py @@ -0,0 +1,21 @@ +from typing import Any, Optional, Protocol + + +class InfrahubLogger(Protocol): + def debug(self, event: Optional[str] = None, *args: Any, **kw: Any) -> Any: + """Send a debug event""" + + def info(self, event: Optional[str] = None, *args: Any, **kw: Any) -> Any: + """Send an info event""" + + def warning(self, event: Optional[str] = None, *args: Any, **kw: Any) -> Any: + """Send a warning event""" + + def error(self, event: Optional[str] = None, *args: Any, **kw: Any) -> Any: + """Send an error event.""" + + def critical(self, event: Optional[str] = None, *args: Any, **kw: Any) -> Any: + """Send a critical event.""" + + def exception(self, event: Optional[str] = None, *args: Any, **kw: Any) -> Any: + """Send an exception event.""" From 
8fde5e858f5e0248f526c9963db7266af9a987f5 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 11:13:43 +0100 Subject: [PATCH 051/446] Move checks and transforms to SDK For now the old modules remain and raise deprecation warnings, we can remove these later when the repositories are up to date. Fixes #1117 --- backend/infrahub/checks.py | 133 +----------------- backend/infrahub/git/repository.py | 4 +- backend/infrahub/transforms.py | 90 ++---------- backend/tests/fixtures/checks/check01.py | 2 +- backend/tests/fixtures/checks/check02.py | 2 +- .../checks/check_spine_interface_status.py | 2 +- .../test_base/unit/transforms/multiplier.py | 2 +- .../tests/fixtures/transforms/transform01.py | 2 +- .../tests/fixtures/transforms/transform02.py | 3 +- docs/topics/transformation.md | 2 +- python_sdk/infrahub_sdk/checks.py | 131 +++++++++++++++++ python_sdk/infrahub_sdk/transforms.py | 86 +++++++++++ .../tests/unit/sdk}/checks/__init__.py | 0 .../tests/unit/sdk}/checks/conftest.py | 3 +- .../tests/unit/sdk}/checks/test_checks.py | 4 +- 15 files changed, 244 insertions(+), 222 deletions(-) create mode 100644 python_sdk/infrahub_sdk/checks.py create mode 100644 python_sdk/infrahub_sdk/transforms.py rename {backend/tests/unit => python_sdk/tests/unit/sdk}/checks/__init__.py (100%) rename {backend/tests/unit => python_sdk/tests/unit/sdk}/checks/conftest.py (99%) rename {backend/tests/unit => python_sdk/tests/unit/sdk}/checks/test_checks.py (96%) diff --git a/backend/infrahub/checks.py b/backend/infrahub/checks.py index 4bbec80ae7..39921d48d6 100644 --- a/backend/infrahub/checks.py +++ b/backend/infrahub/checks.py @@ -1,130 +1,9 @@ -from __future__ import annotations +from warnings import warn -import asyncio -import json -import os -from abc import abstractmethod -from typing import Any, Dict, List, Optional +from infrahub_sdk.checks import INFRAHUB_CHECK_VARIABLE_TO_IMPORT, InfrahubCheck -from git.repo import Repo -from infrahub_sdk import InfrahubClient 
+warn( + f"The module {__name__} is deprecated. Update to use infrahub_sdk.checks instead.", DeprecationWarning, stacklevel=2 +) -INFRAHUB_CHECK_VARIABLE_TO_IMPORT = "INFRAHUB_CHECKS" - - -class InfrahubCheck: - name: Optional[str] = None - query: str = "" - timeout: int = 10 - rebase: bool = True - - def __init__(self, branch: str = "", root_directory: str = "", output: Optional[str] = None): - self.data = None - self.git: Optional[Repo] = None - - self.logs: List[Dict[str, Any]] = [] - self.passed = False - - self.output = output - - self.branch = branch - - self.root_directory = root_directory or os.getcwd() - - self.client: InfrahubClient - - if not self.name: - self.name = self.__class__.__name__ - - if not self.query: - raise ValueError("A query must be provided") - - @classmethod - async def init(cls, client: Optional[InfrahubClient] = None, *args: Any, **kwargs: Any) -> InfrahubCheck: - """Async init method, If an existing InfrahubClient client hasn't been provided, one will be created automatically.""" - - instance = cls(*args, **kwargs) - instance.client = client or InfrahubClient() - - return instance - - @property - def errors(self) -> List[Dict[str, Any]]: - return [log for log in self.logs if log["level"] == "ERROR"] - - def _write_log_entry( - self, message: str, level: str, object_id: Optional[str] = None, object_type: Optional[str] = None - ) -> None: - log_message = {"level": level, "message": message, "branch": self.branch_name} - if object_id: - log_message["object_id"] = object_id - if object_type: - log_message["object_type"] = object_type - self.logs.append(log_message) - - if self.output == "stdout": - print(json.dumps(log_message)) - - def log_error(self, message: str, object_id: Optional[str] = None, object_type: Optional[str] = None) -> None: - self._write_log_entry(message=message, level="ERROR", object_id=object_id, object_type=object_type) - - def log_info(self, message: str, object_id: Optional[str] = None, object_type: Optional[str] 
= None) -> None: - self._write_log_entry(message=message, level="INFO", object_id=object_id, object_type=object_type) - - @property - def log_entries(self) -> str: - output = "" - for log in self.logs: - output += "-----------------------\n" - output += f"Message: {log['message']}\n" - output += f"Level: {log['level']}\n" - if "object_id" in log: - output += f"Object ID: {log['object_id']}\n" - if "object_type" in log: - output += f"Object ID: {log['object_type']}\n" - return output - - @property - def branch_name(self) -> str: - """Return the name of the current git branch.""" - - if self.branch: - return self.branch - - if not self.git: - self.git = Repo(self.root_directory) - - self.branch = str(self.git.active_branch) - - return self.branch - - @abstractmethod - def validate(self) -> None: - """Code to validate the status of this check.""" - - async def collect_data(self) -> None: - """Query the result of the GraphQL Query defined in sef.query and store the result in self.data""" - - data = await self.client.query_gql_query(name=self.query, branch_name=self.branch_name, rebase=self.rebase) - self.data = data - - async def run(self) -> bool: - """Execute the check after collecting the data from the GraphQL query. 
- The result of the check is determined based on the presence or not of ERROR log messages.""" - - await self.collect_data() - - validate_method = getattr(self, "validate") - if asyncio.iscoroutinefunction(validate_method): - await validate_method() - else: - validate_method() - - nbr_errors = len([log for log in self.logs if log["level"] == "ERROR"]) - - self.passed = bool(nbr_errors == 0) - - if self.passed: - self.log_info("Check succesfully completed") - - return self.passed +__all__ = ["INFRAHUB_CHECK_VARIABLE_TO_IMPORT", "InfrahubCheck"] diff --git a/backend/infrahub/git/repository.py b/backend/infrahub/git/repository.py index 4f203badef..5c0b256edd 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -24,12 +24,13 @@ InfrahubRepositoryConfig, ValidationError, ) +from infrahub_sdk.checks import INFRAHUB_CHECK_VARIABLE_TO_IMPORT, InfrahubCheck +from infrahub_sdk.transforms import INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT from infrahub_sdk.utils import YamlFile, compare_lists from pydantic import BaseModel, validator from pydantic import ValidationError as PydanticValidationError import infrahub.config as config -from infrahub.checks import INFRAHUB_CHECK_VARIABLE_TO_IMPORT, InfrahubCheck from infrahub.exceptions import ( CheckError, CommitNotFoundError, @@ -39,7 +40,6 @@ TransformError, ) from infrahub.log import get_logger -from infrahub.transforms import INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT if TYPE_CHECKING: from infrahub_sdk.branch import BranchData diff --git a/backend/infrahub/transforms.py b/backend/infrahub/transforms.py index e6ba254d07..236d1d12f8 100644 --- a/backend/infrahub/transforms.py +++ b/backend/infrahub/transforms.py @@ -1,85 +1,11 @@ -from __future__ import annotations +from warnings import warn -import asyncio -import os -from abc import abstractmethod -from typing import Any, Dict, Optional +from infrahub_sdk.transforms import INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT, InfrahubTransform -from git import 
Repo -from infrahub_sdk import InfrahubClient +warn( + f"The module {__name__} is deprecated. Update to use infrahub_sdk.transforms instead.", + DeprecationWarning, + stacklevel=2, +) -INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT = "INFRAHUB_TRANSFORMS" - - -class InfrahubTransform: - name: Optional[str] = None - query: str - url: str - timeout: int = 10 - rebase: bool = True - - def __init__(self, branch: str = "", root_directory: str = "", server_url: str = ""): - self.data = None - self.git: Repo - - self.branch = branch - - self.server_url = server_url or os.environ.get("INFRAHUB_URL", "http://127.0.0.1:8000") - self.root_directory = root_directory or os.getcwd() - - self.client: InfrahubClient - - if not self.name: - self.name = self.__class__.__name__ - - if not self.query: - raise ValueError("A query must be provided") - if not self.url: - raise ValueError("A url must be provided") - - @classmethod - async def init(cls, client: Optional[InfrahubClient] = None, *args: Any, **kwargs: Any) -> InfrahubTransform: - """Async init method, If an existing InfrahubClient client hasn't been provided, one will be created automatically.""" - - item = cls(*args, **kwargs) - - if client: - item.client = client - else: - item.client = await InfrahubClient.init(address=item.server_url) - - return item - - @property - def branch_name(self) -> str: - """Return the name of the current git branch.""" - - if self.branch: - return self.branch - - if not self.git: - self.git = Repo(self.root_directory) - self.branch = str(self.git.active_branch) - - return self.branch - - @abstractmethod - def transform(self, data: dict) -> Any: - pass - - async def collect_data(self) -> Dict: - """Query the result of the GraphQL Query defined in sef.query and return the result""" - - return await self.client.query_gql_query(name=self.query, branch_name=self.branch_name, rebase=self.rebase) - - async def run(self, data: Optional[dict] = None) -> Any: - """Execute the transformation after collecting the 
data from the GraphQL query. - The result of the check is determined based on the presence or not of ERROR log messages.""" - - if not data: - data = await self.collect_data() - - if asyncio.iscoroutinefunction(self.transform): - return await self.transform(data=data) - - return self.transform(data=data) +__all__ = ["INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT", "InfrahubTransform"] diff --git a/backend/tests/fixtures/checks/check01.py b/backend/tests/fixtures/checks/check01.py index 3c52eb9bf8..42c2c7f788 100644 --- a/backend/tests/fixtures/checks/check01.py +++ b/backend/tests/fixtures/checks/check01.py @@ -1,4 +1,4 @@ -from infrahub.checks import InfrahubCheck +from infrahub_sdk.checks import InfrahubCheck class Check01(InfrahubCheck): diff --git a/backend/tests/fixtures/checks/check02.py b/backend/tests/fixtures/checks/check02.py index 70d67e6819..462b27a0eb 100644 --- a/backend/tests/fixtures/checks/check02.py +++ b/backend/tests/fixtures/checks/check02.py @@ -1,4 +1,4 @@ -from infrahub.checks import InfrahubCheck +from infrahub_sdk.checks import InfrahubCheck class Check02(InfrahubCheck): diff --git a/backend/tests/fixtures/project_02/checks/check_spine_interface_status.py b/backend/tests/fixtures/project_02/checks/check_spine_interface_status.py index 0330a8501b..75b2d7e119 100644 --- a/backend/tests/fixtures/project_02/checks/check_spine_interface_status.py +++ b/backend/tests/fixtures/project_02/checks/check_spine_interface_status.py @@ -1,4 +1,4 @@ -from infrahub.checks import InfrahubCheck +from infrahub_sdk.checks import InfrahubCheck class InfrahubCheckSpineNbrInterfaceDisabled(InfrahubCheck): diff --git a/backend/tests/fixtures/repos/test_base/unit/transforms/multiplier.py b/backend/tests/fixtures/repos/test_base/unit/transforms/multiplier.py index 2f0b73e806..7ac210ea67 100644 --- a/backend/tests/fixtures/repos/test_base/unit/transforms/multiplier.py +++ b/backend/tests/fixtures/repos/test_base/unit/transforms/multiplier.py @@ -1,6 +1,6 @@ from typing 
import Any, Dict -from infrahub.transforms import InfrahubTransform +from infrahub_sdk.transforms import InfrahubTransform class Multiplier(InfrahubTransform): diff --git a/backend/tests/fixtures/transforms/transform01.py b/backend/tests/fixtures/transforms/transform01.py index 369e99c060..4b2ee61a59 100644 --- a/backend/tests/fixtures/transforms/transform01.py +++ b/backend/tests/fixtures/transforms/transform01.py @@ -1,4 +1,4 @@ -from infrahub.transforms import InfrahubTransform +from infrahub_sdk.transforms import InfrahubTransform class Transform01(InfrahubTransform): diff --git a/backend/tests/fixtures/transforms/transform02.py b/backend/tests/fixtures/transforms/transform02.py index 2f8c571b1e..d88e7d0c59 100644 --- a/backend/tests/fixtures/transforms/transform02.py +++ b/backend/tests/fixtures/transforms/transform02.py @@ -1,9 +1,8 @@ -from infrahub.transforms import InfrahubTransform +from infrahub_sdk.transforms import InfrahubTransform class Transform02(InfrahubTransform): query = "my_query" - # url = "transform01" def transform(self, data: dict): return {str(key).upper(): value for key, value in data.items()} diff --git a/docs/topics/transformation.md b/docs/topics/transformation.md index d183e2514b..4fcdd375f4 100644 --- a/docs/topics/transformation.md +++ b/docs/topics/transformation.md @@ -141,7 +141,7 @@ Each TransformPython must also define as Class level variables: - `url` : The URL where this TransformPython will be exposed via the REST API ```python -from infrahub.transforms import InfrahubTransform +from infrahub_sdk.transforms import InfrahubTransform class MyPythonTransformation(InfrahubTransform): diff --git a/python_sdk/infrahub_sdk/checks.py b/python_sdk/infrahub_sdk/checks.py new file mode 100644 index 0000000000..023a306989 --- /dev/null +++ b/python_sdk/infrahub_sdk/checks.py @@ -0,0 +1,131 @@ +from __future__ import annotations + +import asyncio +import json +import os +from abc import abstractmethod +from typing import Any, Dict, List, 
Optional + +from git.repo import Repo + +from infrahub_sdk import InfrahubClient + +INFRAHUB_CHECK_VARIABLE_TO_IMPORT = "INFRAHUB_CHECKS" + + +class InfrahubCheck: + name: Optional[str] = None + query: str = "" + timeout: int = 10 + rebase: bool = True + + def __init__(self, branch: str = "", root_directory: str = "", output: Optional[str] = None): + self.data: Dict = {} + self.git: Optional[Repo] = None + + self.logs: List[Dict[str, Any]] = [] + self.passed = False + + self.output = output + + self.branch = branch + + self.root_directory = root_directory or os.getcwd() + + self.client: InfrahubClient + + if not self.name: + self.name = self.__class__.__name__ + + if not self.query: + raise ValueError("A query must be provided") + + @classmethod + async def init(cls, client: Optional[InfrahubClient] = None, *args: Any, **kwargs: Any) -> InfrahubCheck: + """Async init method. If an existing InfrahubClient hasn't been provided, one will be created automatically.""" + + instance = cls(*args, **kwargs) + instance.client = client or InfrahubClient() + + return instance + + @property + def errors(self) -> List[Dict[str, Any]]: + return [log for log in self.logs if log["level"] == "ERROR"] + + def _write_log_entry( + self, message: str, level: str, object_id: Optional[str] = None, object_type: Optional[str] = None + ) -> None: + log_message = {"level": level, "message": message, "branch": self.branch_name} + if object_id: + log_message["object_id"] = object_id + if object_type: + log_message["object_type"] = object_type + self.logs.append(log_message) + + if self.output == "stdout": + print(json.dumps(log_message)) + + def log_error(self, message: str, object_id: Optional[str] = None, object_type: Optional[str] = None) -> None: + self._write_log_entry(message=message, level="ERROR", object_id=object_id, object_type=object_type) + + def log_info(self, message: str, object_id: Optional[str] = None, object_type: Optional[str] = None) -> None: +
self._write_log_entry(message=message, level="INFO", object_id=object_id, object_type=object_type) + + @property + def log_entries(self) -> str: + output = "" + for log in self.logs: + output += "-----------------------\n" + output += f"Message: {log['message']}\n" + output += f"Level: {log['level']}\n" + if "object_id" in log: + output += f"Object ID: {log['object_id']}\n" + if "object_type" in log: + output += f"Object Type: {log['object_type']}\n" + return output + + @property + def branch_name(self) -> str: + """Return the name of the current git branch.""" + + if self.branch: + return self.branch + + if not self.git: + self.git = Repo(self.root_directory) + + self.branch = str(self.git.active_branch) + + return self.branch + + @abstractmethod + def validate(self) -> None: + """Code to validate the status of this check.""" + + async def collect_data(self) -> None: + """Query the result of the GraphQL Query defined in self.query and store the result in self.data""" + + data = await self.client.query_gql_query(name=self.query, branch_name=self.branch_name, rebase=self.rebase) + self.data = data + + async def run(self) -> bool: + """Execute the check after collecting the data from the GraphQL query.
+ The result of the check is determined based on the presence or absence of ERROR log messages.""" + + await self.collect_data() + + validate_method = getattr(self, "validate") + if asyncio.iscoroutinefunction(validate_method): + await validate_method() + else: + validate_method() + + nbr_errors = len([log for log in self.logs if log["level"] == "ERROR"]) + + self.passed = bool(nbr_errors == 0) + + if self.passed: + self.log_info("Check successfully completed") + + return self.passed diff --git a/python_sdk/infrahub_sdk/transforms.py b/python_sdk/infrahub_sdk/transforms.py new file mode 100644 index 0000000000..5841e81746 --- /dev/null +++ b/python_sdk/infrahub_sdk/transforms.py @@ -0,0 +1,86 @@ +from __future__ import annotations + +import asyncio +import os +from abc import abstractmethod +from typing import Any, Dict, Optional + +from git import Repo + +from infrahub_sdk import InfrahubClient + +INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT = "INFRAHUB_TRANSFORMS" + + +class InfrahubTransform: + name: Optional[str] = None + query: str + url: str + timeout: int = 10 + rebase: bool = True + + def __init__(self, branch: str = "", root_directory: str = "", server_url: str = ""): + self.data = None + self.git: Repo + + self.branch = branch + + self.server_url = server_url or os.environ.get("INFRAHUB_URL", "http://127.0.0.1:8000") + self.root_directory = root_directory or os.getcwd() + + self.client: InfrahubClient + + if not self.name: + self.name = self.__class__.__name__ + + if not self.query: + raise ValueError("A query must be provided") + if not self.url: + raise ValueError("A url must be provided") + + @classmethod + async def init(cls, client: Optional[InfrahubClient] = None, *args: Any, **kwargs: Any) -> InfrahubTransform: + """Async init method. If an existing InfrahubClient client hasn't been provided, one will be created automatically.""" + + item = cls(*args, **kwargs) + + if client: + item.client = client + else: + item.client = await 
InfrahubClient.init(address=item.server_url) + + return item + + @property + def branch_name(self) -> str: + """Return the name of the current git branch.""" + + if self.branch: + return self.branch + + if not self.git: + self.git = Repo(self.root_directory) + self.branch = str(self.git.active_branch) + + return self.branch + + @abstractmethod + def transform(self, data: dict) -> Any: + pass + + async def collect_data(self) -> Dict: + """Query the result of the GraphQL Query defined in self.query and return the result""" + + return await self.client.query_gql_query(name=self.query, branch_name=self.branch_name, rebase=self.rebase) + + async def run(self, data: Optional[dict] = None) -> Any: + """Execute the transformation after collecting the data from the GraphQL query. + The result of the transformation is returned.""" + + if not data: + data = await self.collect_data() + + if asyncio.iscoroutinefunction(self.transform): + return await self.transform(data=data) + + return self.transform(data=data) diff --git a/backend/tests/unit/checks/__init__.py b/python_sdk/tests/unit/sdk/checks/__init__.py similarity index 100% rename from backend/tests/unit/checks/__init__.py rename to python_sdk/tests/unit/sdk/checks/__init__.py diff --git a/backend/tests/unit/checks/conftest.py b/python_sdk/tests/unit/sdk/checks/conftest.py similarity index 99% rename from backend/tests/unit/checks/conftest.py rename to python_sdk/tests/unit/sdk/checks/conftest.py index 595a0c7ae9..2ec879f80b 100644 --- a/backend/tests/unit/checks/conftest.py +++ b/python_sdk/tests/unit/sdk/checks/conftest.py @@ -1,7 +1,8 @@ import pytest -from infrahub_sdk import InfrahubClient from pytest_httpx import HTTPXMock +from infrahub_sdk import InfrahubClient + @pytest.fixture async def client() -> InfrahubClient: diff --git a/backend/tests/unit/checks/test_checks.py b/python_sdk/tests/unit/sdk/checks/test_checks.py similarity index 96% rename from 
backend/tests/unit/checks/test_checks.py rename to python_sdk/tests/unit/sdk/checks/test_checks.py index d4839af840..b0f0bc4f86 100644 --- a/backend/tests/unit/checks/test_checks.py +++ b/python_sdk/tests/unit/sdk/checks/test_checks.py @@ -1,7 +1,7 @@ import pytest -from infrahub_sdk import InfrahubClient -from infrahub.checks import InfrahubCheck +from infrahub_sdk import InfrahubClient +from infrahub_sdk.checks import InfrahubCheck async def test_class_init(): From bff3f384357876cdf06490bc64d38149ff830f6b Mon Sep 17 00:00:00 2001 From: pa-lem Date: Thu, 16 Nov 2023 11:34:18 +0100 Subject: [PATCH 052/446] mark read only attributes as protected --- frontend/src/infraops.d.ts | 2 ++ .../edit-form-hook/dynamic-control-types.ts | 1 + .../src/utils/formStructureForCreateEdit.ts | 25 ++++++++++++------- 3 files changed, 19 insertions(+), 9 deletions(-) diff --git a/frontend/src/infraops.d.ts b/frontend/src/infraops.d.ts index eadc4a93d4..1770bd298c 100644 --- a/frontend/src/infraops.d.ts +++ b/frontend/src/infraops.d.ts @@ -74,6 +74,7 @@ export interface components { }; /** AttributeSchema */ AttributeSchema: { + read_only: boolean | undefined; /** Id */ id?: string; /** Name */ @@ -411,6 +412,7 @@ export interface components { RelationshipKind: "Generic" | "Attribute" | "Component" | "Parent"; /** RelationshipSchema */ RelationshipSchema: { + read_only: any; /** Id */ id?: string; /** Name */ diff --git a/frontend/src/screens/edit-form-hook/dynamic-control-types.ts b/frontend/src/screens/edit-form-hook/dynamic-control-types.ts index 3884095af2..f86ce9839d 100644 --- a/frontend/src/screens/edit-form-hook/dynamic-control-types.ts +++ b/frontend/src/screens/edit-form-hook/dynamic-control-types.ts @@ -94,5 +94,6 @@ export interface DynamicFieldData { error?: FormFieldError; isProtected?: boolean; isOptionnal?: boolean; + isReadOnly?: boolean; disabled?: boolean; } diff --git a/frontend/src/utils/formStructureForCreateEdit.ts 
b/frontend/src/utils/formStructureForCreateEdit.ts index ca78975d3c..5025b995d6 100644 --- a/frontend/src/utils/formStructureForCreateEdit.ts +++ b/frontend/src/utils/formStructureForCreateEdit.ts @@ -9,7 +9,10 @@ import { } from "../screens/edit-form-hook/dynamic-control-types"; import { iGenericSchema, iNodeSchema } from "../state/atoms/schema.atom"; -const getIsDisabled = (owner?: any, user?: any, isProtected?: boolean) => { +const getIsDisabled = ({ owner, user, isProtected, isReadOnly }: any) => { + // Field is read only + if (isReadOnly) return true; + // Field is available if there is no owner and if is_protected is not set to true if (!isProtected || !owner || user?.permissions?.isAdmin) return false; @@ -114,11 +117,13 @@ const getFormStructureForCreateEdit = ( validate: (value: any) => validate(value, attribute, attribute.optional), }, isOptionnal: attribute.optional, - isProtected: getIsDisabled( - row && row[attribute.name]?.owner, + isReadOnly: attribute.read_only, + isProtected: getIsDisabled({ + owner: row && row[attribute.name]?.owner, user, - row && row[attribute.name] && row[attribute.name].is_protected - ), + isProtected: row && row[attribute.name] && row[attribute.name].is_protected, + isReadOnly: attribute.read_only, + }), }); }); @@ -191,11 +196,13 @@ const getFormStructureForCreateEdit = ( validate: (value: any) => validate(value, undefined, relationship.optional), }, isOptionnal: relationship.optional, - isProtected: getIsDisabled( - row && row[relationship.name]?.properties?.owner, + isProtected: getIsDisabled({ + owner: row && row[relationship.name]?.properties?.owner, user, - row && row[relationship.name] && row[relationship.name]?.properties?.is_protected - ), + isProtected: + row && row[relationship.name] && row[relationship.name]?.properties?.is_protected, + isReadOnly: relationship.read_only, + }), }); }); From 3e8ea10382346f0427a570f5bf7c0eeb10a68367 Mon Sep 17 00:00:00 2001 From: Bilal Date: Thu, 16 Nov 2023 11:44:11 +0100 Subject: 
[PATCH 053/446] constructPath is able to delete and update qsp --- .../src/screens/branches/branch-details.tsx | 13 ++--- frontend/src/screens/diff/checks/conflict.tsx | 5 +- frontend/src/utils/fetch.ts | 48 +++++++++---------- 3 files changed, 34 insertions(+), 32 deletions(-) diff --git a/frontend/src/screens/branches/branch-details.tsx b/frontend/src/screens/branches/branch-details.tsx index 4f59d4f675..bf50045e02 100644 --- a/frontend/src/screens/branches/branch-details.tsx +++ b/frontend/src/screens/branches/branch-details.tsx @@ -33,7 +33,8 @@ import ErrorScreen from "../error-screen/error-screen"; import LoadingScreen from "../loading-screen/loading-screen"; import ObjectItemCreate from "../object-item-create/object-item-create-paginated"; import { getFormStructure } from "../proposed-changes/conversations"; -import { getCurrentQsp } from "../../utils/fetch"; +import { constructPath, getCurrentQsp } from "../../utils/fetch"; +import { QSP } from "../../config/qsp"; export const BranchDetails = () => { const { branchname } = useParams(); @@ -148,13 +149,13 @@ export const BranchDetails = () => { }); const queryStringParams = getCurrentQsp(); - const isDeletedBranchSelected = queryStringParams.get("branch") === branch.name; + const isDeletedBranchSelected = queryStringParams.get(QSP.BRANCH) === branch.name; - if (isDeletedBranchSelected) { - queryStringParams.delete("branch"); // back to main branch - } + const path = isDeletedBranchSelected + ? constructPath("/branches", [{ name: QSP.BRANCH, exclude: true }]) + : constructPath("/branches"); - navigate("/branches?" 
+ queryStringParams.toString()); + navigate(path); window.location.reload(); }} diff --git a/frontend/src/screens/diff/checks/conflict.tsx b/frontend/src/screens/diff/checks/conflict.tsx index e89f39bc7d..97092f874a 100644 --- a/frontend/src/screens/diff/checks/conflict.tsx +++ b/frontend/src/screens/diff/checks/conflict.tsx @@ -19,6 +19,7 @@ import { constructPath } from "../../../utils/fetch"; import { getObjectDetailsUrl } from "../../../utils/objects"; import { stringifyWithoutQuotes } from "../../../utils/string"; import { getNodeClassName } from "../data-diff-node"; +import { QSP } from "../../../config/qsp"; const renderConflict = { attribute_value: (name: string) => { @@ -142,7 +143,9 @@ export const Conflict = (props: any) => { value: change, }; - const url = constructPath(getObjectDetailsUrl(node_id, kind), [["branch", branch]]); + const url = constructPath(getObjectDetailsUrl(node_id, kind), [ + { name: QSP.BRANCH, value: branch }, + ]); const isSelected = (keep_branch?.value === "target" && branch === "main") || diff --git a/frontend/src/utils/fetch.ts b/frontend/src/utils/fetch.ts index 06da96ea14..cce238ba70 100644 --- a/frontend/src/utils/fetch.ts +++ b/frontend/src/utils/fetch.ts @@ -51,34 +51,32 @@ export const fetchStream = async (url: string, payload?: any) => { const QSP_TO_INCLUDE = [QSP.BRANCH]; -const getParams = (params: [string, string][], overrideParams?: [string, string][]) => { - if (overrideParams?.length) { - return overrideParams; - } - - if (params?.length) { - return params; - } - - return []; +type overrideQueryParams = { + name: string; + value?: string; + exclude?: boolean; }; // Construct link with path that contains all QSP -export const constructPath = (path: string, overrideParams?: [string, string][]) => { - const targetParams = getCurrentQsp(); - - // Get QSP as [ [ key, value ], ... 
] - const params: [string, string][] = Array.from(targetParams).filter( - ([k]) => QSP_TO_INCLUDE.includes(k) // Remove some QSP if not needed to be forwarded - ); - - // Construct the new params as "?key=value&..." - const newParams = getParams(params, overrideParams).reduce( - (acc, [k, v], index) => `${acc}${k}=${v}${index === params.length - 1 ? "" : "&"}`, - "?" - ); - - return `${path}${newParams}`; +export const constructPath = (path: string, overrideParams?: overrideQueryParams[]) => { + const currentURLSearchParams = getCurrentQsp(); + const newURLSearchParams = new URLSearchParams(); + + // Remove some QSP if not needed to be forwarded + QSP_TO_INCLUDE.forEach((qsp) => { + const paramValue = currentURLSearchParams.get(qsp); + if (paramValue) newURLSearchParams.set(qsp, paramValue); + }); + + overrideParams?.forEach(({ name, value, exclude }) => { + if (exclude) { + newURLSearchParams.delete(name); + } else if (value) { + newURLSearchParams.set(name, value); + } + }); + + return `${path}?${newURLSearchParams.toString()}`; }; export const getCurrentQsp = () => new URL(window.location.href).searchParams; From 4e59f3f5316c50fcf98ba6fe27db7aef62eeb277 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Thu, 16 Nov 2023 12:28:28 +0100 Subject: [PATCH 054/446] remove read only attributes from mutation --- .../object-item-edit/object-item-edit-paginated.tsx | 9 +++------ frontend/src/utils/getMutationDetailsFromFormData.ts | 5 +++++ 2 files changed, 8 insertions(+), 6 deletions(-) diff --git a/frontend/src/screens/object-item-edit/object-item-edit-paginated.tsx b/frontend/src/screens/object-item-edit/object-item-edit-paginated.tsx index 3c757fcfe0..e87e096f99 100644 --- a/frontend/src/screens/object-item-edit/object-item-edit-paginated.tsx +++ b/frontend/src/screens/object-item-edit/object-item-edit-paginated.tsx @@ -142,16 +142,13 @@ export default function ObjectItemEditComponent(props: Props) { closeDrawer(); onUpdateComplete(); - setIsLoading(false); - return; + 
setIsLoading(false); } catch (e) { console.error("Something went wrong while updating the object:", e); - - setIsLoading(false); - - return; } + + setIsLoading(false); } } diff --git a/frontend/src/utils/getMutationDetailsFromFormData.ts b/frontend/src/utils/getMutationDetailsFromFormData.ts index fc150cc30d..fec78f3c65 100644 --- a/frontend/src/utils/getMutationDetailsFromFormData.ts +++ b/frontend/src/utils/getMutationDetailsFromFormData.ts @@ -14,6 +14,11 @@ const getMutationDetailsFromFormData = ( schema.attributes?.forEach((attribute) => { const updatedValue = updatedObject[attribute.name]?.value ?? attribute?.default_value; + if (attribute.read_only) { + // Delete the attribute if it's read-only + delete updatedObject[attribute.name]; + } + if (mode === "update" && existingObject) { const existingValue = existingObject[attribute.name]?.value; From 673f75962a415255484eb9bbb17c27a17b1f8c33 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Thu, 16 Nov 2023 15:54:45 +0100 Subject: [PATCH 055/446] hide read only attributes in forms --- frontend/src/utils/formStructureForCreateEdit.ts | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/frontend/src/utils/formStructureForCreateEdit.ts b/frontend/src/utils/formStructureForCreateEdit.ts index 5025b995d6..c0312fec9a 100644 --- a/frontend/src/utils/formStructureForCreateEdit.ts +++ b/frontend/src/utils/formStructureForCreateEdit.ts @@ -102,6 +102,11 @@ const getFormStructureForCreateEdit = ( })); } + if (attribute.read_only) { + // Hide read-only attributes + return; + } + formFields.push({ name: attribute.name + ".value", kind: attribute.kind as SchemaAttributeType, @@ -136,6 +141,11 @@ const getFormStructureForCreateEdit = ( relationship.kind === "Parent" ) .forEach((relationship) => { + if (relationship.read_only) { + // Hide read-only relationship + return; + } + let options: SelectOption[] = []; const isInherited = generics.find((g) => g.kind === relationship.peer); From d99251ee2c9967d225f213610b9ae18a9f64668d 
Mon Sep 17 00:00:00 2001 From: Bilal Date: Thu, 16 Nov 2023 16:00:40 +0100 Subject: [PATCH 056/446] self-hosted all mdi icons --- frontend/package-lock.json | 34 ++++++++++++++++++++++++++++++++++ frontend/package.json | 1 + frontend/src/App.tsx | 3 +++ 3 files changed, 38 insertions(+) diff --git a/frontend/package-lock.json b/frontend/package-lock.json index da6932d709..877c4293ae 100644 --- a/frontend/package-lock.json +++ b/frontend/package-lock.json @@ -13,6 +13,8 @@ "@heroicons/react": "^2.0.15", "@hookform/error-message": "^2.0.1", "@iconify-icon/react": "^1.0.8", + "@iconify-icons/mdi": "^1.2.48", + "@iconify-json/mdi": "^1.1.55", "@sentry/react": "^7.45.0", "@sentry/tracing": "^7.45.0", "@tailwindcss/forms": "^0.5.3", @@ -2409,6 +2411,22 @@ "react": ">=16" } }, + "node_modules/@iconify-icons/mdi": { + "version": "1.2.48", + "resolved": "https://registry.npmjs.org/@iconify-icons/mdi/-/mdi-1.2.48.tgz", + "integrity": "sha512-51bfNoRLhYDfxSu0Nyi/uRVq6q/tP4TyEc0vvuNwImrXpxrRJUAWJF2A36CfBkXm3hO9IBlph/CD/XNDJKgG6w==", + "dependencies": { + "@iconify/types": "*" + } + }, + "node_modules/@iconify-json/mdi": { + "version": "1.1.55", + "resolved": "https://registry.npmjs.org/@iconify-json/mdi/-/mdi-1.1.55.tgz", + "integrity": "sha512-ycnFub+EQx+3D/aDCg6iC7sjexOUa5GzxUNIZFFl0Pq7aDxbmhIludoyYnguEO3REyWf9FcOOmvVcQkdtwKHTw==", + "dependencies": { + "@iconify/types": "*" + } + }, "node_modules/@iconify/types": { "version": "2.0.0", "resolved": "https://registry.npmjs.org/@iconify/types/-/types-2.0.0.tgz", @@ -14305,6 +14323,22 @@ "iconify-icon": "^1.0.8" } }, + "@iconify-icons/mdi": { + "version": "1.2.48", + "resolved": "https://registry.npmjs.org/@iconify-icons/mdi/-/mdi-1.2.48.tgz", + "integrity": "sha512-51bfNoRLhYDfxSu0Nyi/uRVq6q/tP4TyEc0vvuNwImrXpxrRJUAWJF2A36CfBkXm3hO9IBlph/CD/XNDJKgG6w==", + "requires": { + "@iconify/types": "*" + } + }, + "@iconify-json/mdi": { + "version": "1.1.55", + "resolved": "https://registry.npmjs.org/@iconify-json/mdi/-/mdi-1.1.55.tgz", 
+ "integrity": "sha512-ycnFub+EQx+3D/aDCg6iC7sjexOUa5GzxUNIZFFl0Pq7aDxbmhIludoyYnguEO3REyWf9FcOOmvVcQkdtwKHTw==", + "requires": { + "@iconify/types": "*" + } + }, "@iconify/types": { "version": "2.0.0", "resolved": "https://registry.npmjs.org/@iconify/types/-/types-2.0.0.tgz", diff --git a/frontend/package.json b/frontend/package.json index 11934cc6e1..8c1c3ac4d1 100644 --- a/frontend/package.json +++ b/frontend/package.json @@ -31,6 +31,7 @@ "@heroicons/react": "^2.0.15", "@hookform/error-message": "^2.0.1", "@iconify-icon/react": "^1.0.8", + "@iconify-json/mdi": "^1.1.55", "@sentry/react": "^7.45.0", "@sentry/tracing": "^7.45.0", "@tailwindcss/forms": "^0.5.3", diff --git a/frontend/src/App.tsx b/frontend/src/App.tsx index c3acbce709..5749ee8485 100644 --- a/frontend/src/App.tsx +++ b/frontend/src/App.tsx @@ -25,6 +25,9 @@ import { schemaKindNameState } from "./state/atoms/schemaKindName.atom"; import "./styles/index.css"; import { sortByOrderWeight } from "./utils/common"; import { fetchUrl } from "./utils/fetch"; +import mdiIcons from "@iconify-json/mdi/icons.json"; +import { addCollection } from "@iconify-icon/react"; +addCollection(mdiIcons); function App() { const [branches] = useAtom(branchesState); From 1c861e98514c3fe01a6a9f24d46fb9f5471a3676 Mon Sep 17 00:00:00 2001 From: Aaron McCarty Date: Thu, 16 Nov 2023 07:06:53 -0800 Subject: [PATCH 057/446] some fixes/updates for tutorial pages (#1413) --- docs/tutorials/getting-started/branches.md | 2 +- .../getting-started/custom-api-endpoint.md | 4 ++-- .../getting-started/git-integration.md | 12 +++++----- .../getting-started/graphql-query.md | 23 ++++++++----------- .../introduction-to-infrahub.md | 2 +- .../getting-started/jinja2-integration.md | 16 ++++++------- .../getting-started/lineage-information.md | 4 ++-- 7 files changed, 29 insertions(+), 34 deletions(-) diff --git a/docs/tutorials/getting-started/branches.md b/docs/tutorials/getting-started/branches.md index 623d9467b0..e3a0577566 100644 --- 
a/docs/tutorials/getting-started/branches.md +++ b/docs/tutorials/getting-started/branches.md @@ -106,7 +106,7 @@ Execute `invoke demo.cli-git` to start a shell session that will give you access ==- Merge a Branch using GraphQL -Use the GraphQL query below to create a new branch named `cr1234` +Use the GraphQL query below to merge the branch named `cr1234` ```graphql # Endpoint : http://127.0.0.1:8000/graphql/main diff --git a/docs/tutorials/getting-started/custom-api-endpoint.md b/docs/tutorials/getting-started/custom-api-endpoint.md index 1acabf883e..3e1de1ed4b 100644 --- a/docs/tutorials/getting-started/custom-api-endpoint.md +++ b/docs/tutorials/getting-started/custom-api-endpoint.md @@ -4,8 +4,8 @@ label: Custom API Endpoint tags: [tutorial] order: 300 --- -As powerful as Jinja templates are, sometimes it’s both cleaner and simpler to work directly in code. The Infrahub transform endpoint lets you do just that. Where an `RFile` combines a GraphQL query together with a Jinja template, the Transform operation combines a GraphQL query with code. A scenario where use a Transform instead of an `RFile` is when you need to return structured data as opposed to a classic text-based configuration file. -In the example repository `infrahub-demo-edge` we use a Transform to render configuration in the Openconfig format. In this example, we want to generate Openconfig interface data. The URL to target looks like this: +As powerful as Jinja templates are, sometimes it’s both cleaner and simpler to work directly in code. The Infrahub transform endpoint lets you do just that. Where an `RFile` combines a GraphQL query together with a Jinja template, the Transform operation combines a GraphQL query with code. You might use a Transform instead of an `RFile` when you need to return structured data as opposed to a classic text-based configuration file. +In the example repository `infrahub-demo-edge` we use a Transform to render a configuration in the Openconfig format. 
In this example, we want to generate Openconfig interface data. The URL to target looks like this: http://localhost:8000/api/transform/openconfig/interfaces?device=ord1-edge1&branch=cr1234 diff --git a/docs/tutorials/getting-started/git-integration.md b/docs/tutorials/getting-started/git-integration.md index ba0ef75fd4..cd2bc9795a 100644 --- a/docs/tutorials/getting-started/git-integration.md +++ b/docs/tutorials/getting-started/git-integration.md @@ -5,10 +5,10 @@ tags: [tutorial] order: 500 --- -One of the 3 pillar Infrahub is built on is the idea of having **Unified Version Control for Data and Files** at the same time. +One of the three pillars Infrahub is built on is the idea of having **Unified Version Control for Data and Files** at the same time. The data being stored in the Graph Database and the files in Git. -When integrating a Git repository with Infrahub the Git Agent will ensure that both systems will stay in sync at any time. +When integrating a Git repository with Infrahub, the Git Agent will ensure that both systems will stay in sync at any time. @@ -34,8 +34,8 @@ Once you have created a fork in Github, you'll need a Personal Access Token to a !!! -If you already cloned the repo in the past, ensure there only the main branch is present in Github. -If other branches are present, it's recommanded to delete them for now. +If you already cloned the repo in the past, ensure that only the main branch is present in Github. +If other branches are present, we recommend deleting them for now. !!! ==- How to Delete a branch in Github @@ -79,8 +79,8 @@ mutation { !!!success Validate that everything is correct In the UI, new objects that have been imported from the Git Repository should now be available: -- The repository should be visible under [Objects / Repository](http://localhost:8000/objects/Repository/). If the repository you added doesn't have the commit property populated it means that the initial sync didn't work. Verify the location and credentials. 
-- 2 Rfile under [Objects / RFile](http://localhost:8000/objects/RFile/) +- 2 [Rfiles](http://localhost:8000/objects/CoreRFile/) - 5 GraphQL Queries under [Objects / Graphql Query](http://localhost:8000/objects/GraphQLQuery/) !!! diff --git a/docs/tutorials/getting-started/graphql-query.md b/docs/tutorials/getting-started/graphql-query.md index 71d057411b..3a2dedb619 100644 --- a/docs/tutorials/getting-started/graphql-query.md +++ b/docs/tutorials/getting-started/graphql-query.md @@ -10,14 +10,14 @@ The GraphQL interface is accessible at [http://localhost:8000/graphql](http://lo ### Introduction to GraphQL -GraphQL is the main interface to programatically interact with Infrahub. Via the GraphQL interface, it's possible to perform all the standard CRUD operations: Create, Read, Update and Delete any objects in the database. +GraphQL is the main interface to programmatically interact with Infrahub. Via the GraphQL interface, it's possible to perform all the standard CRUD operations (Create, Read, Update and Delete) on any objects in the database. -In GraphQL terminology, a `query` reference any read operation and a `mutation` reference any write operation that may change the value of the data. +In GraphQL terminology, a `query` references any read operation and a `mutation` references any write operation that may change the value of the data. Infrahub support both `query` and `mutation` for all objects. One of the main concepts behind GraphQL is the presence of a Schema that defines what type of information we have in the database and how these objects are related to each other, based on this schema, a user can execute queries that will return data. 
-Unlike a REST API, the format of the response is not fixed in GraphQL, it depends on the query and you get back only that you asked for. +Unlike a REST API, the format of the response is not fixed in GraphQL, it depends on the query and you get back only what you asked for. ### First Query @@ -45,21 +45,16 @@ Query all interfaces and IP addresses for `ord1-edge` ```graphql # GraphQL query with a top level filter # Endpoint : http://127.0.0.1:8000/graphql/main -query { - InfraDevice(name__value: "ord1-edge1") { +query DeviceIPAddresses { + InfraInterfaceL3(device__name__value:"ord1-edge1") { edges { node { - name { - value - } - interfaces { + name { value } + description { value } + ip_addresses { edges { node { - id - name { - value - } - description { + address { value } } diff --git a/docs/tutorials/getting-started/introduction-to-infrahub.md b/docs/tutorials/getting-started/introduction-to-infrahub.md index 3f148ec482..2b2404b5f7 100644 --- a/docs/tutorials/getting-started/introduction-to-infrahub.md +++ b/docs/tutorials/getting-started/introduction-to-infrahub.md @@ -21,5 +21,5 @@ During this tutorial we'll mainly use the Frontend, the `infrahubctl` CLI and Gr | **Git Agent** | Infrahub Agent that manages all content hosted in Git. | -- | | **Git Server** | External Git Server like Github or Gitlab that can host some Git repositories | | | **GraphDB** | Main database where all information in the graph are stored. Neo4j 5.x | -- | -| **Cache** | Cache based on Redis, mainly used to support the reservation of shared resources across all comoponents. | -- | +| **Cache** | Cache based on Redis, mainly used to support the reservation of shared resources across all components. 
| -- | | **Message Bus** | Message bus based on RabbitMQ to allow all components to interact |-- | diff --git a/docs/tutorials/getting-started/jinja2-integration.md b/docs/tutorials/getting-started/jinja2-integration.md index e7173aa61f..9a2fab3cf5 100644 --- a/docs/tutorials/getting-started/jinja2-integration.md +++ b/docs/tutorials/getting-started/jinja2-integration.md @@ -20,7 +20,7 @@ An `RFile` is an internal concept that represents a Jinja Template coupled with The rendered configuration is available via the REST API under `/api/rfile/` followed by any additional parameters expected in the GraphQL query. -The `rfile` **device_startup** present in the repository, expect the name of the device as a parameter `/api/rfile/?device=`, as an example, below is the URL for couple of devices: +The `rfile` **device_startup** present in the repository, expects the name of the device as a parameter `/api/rfile/?device=`, as an example, below is the URL for couple of devices: - [Config for `ord1-edge1` (/api/rfile/device_startup?device=ord1-edge1)](http://localhost:8000/api/rfile/device_startup?device=ord1-edge1) - [Config for `atl1-edge2` (/api/rfile/device_startup?device=atl1-edge2)](http://localhost:8000/api/rfile/device_startup?device=atl1-edge2) @@ -31,7 +31,7 @@ Next, we'll create a new branch, and make modifications both in the data and in #### 1. Create a new branch `update-ethernet1` -From the frontend, create a new branch named `update-ethernet1` and be sure to uncheck the toggle `is Data Only` in the UI. +From the frontend, create a new branch named `update-ethernet1` and be sure to uncheck the toggle `Is data only` in the UI. 
![Create a new branch (not with Data Only)](../../media/tutorial/tutorial-6-git-integration.cy.ts/tutorial_6_branch_creation.png) @@ -40,9 +40,9 @@ From the frontend, create a new branch named `update-ethernet1` and be sure to u Now we'll make a change in the branch `update-ethernet1` that will be reflected in the rendered template, like updating the documentation. 1. Navigate to the device `atl1-edge1` in the frontend and -2. Navigate to the list of its interfaces in the `interfaces` Tab. +2. Navigate to the list of its interfaces in the `Interfaces` Tab. 3. Select the interface `Ethernet1` -4. Edit the interface `Ethernet` and +4. Edit `Interface L3` using the `Edit` button 5. Update its description to `New description in the branch` 6. Save your change @@ -55,7 +55,7 @@ The final step is to modify the Jinja template directly from Github In Github: - Navigate to your clone - Select the new branch in the branch menu dropdown -- Select the file `device_startup_config.tpl.j2` at the root of the repository +- Select the file `templates` / `device_startup_config.tpl.j2` - Edit the file with the `pen` in the top right corner - Delete the lines 77 and 78 (i.e. 
the last two lines of 'ip prefix-list BOGON-Prefixes') - Commit your changes in the branch `update-ethernet1` directly from github @@ -64,10 +64,10 @@ In Github: !!!success Validate that everything is correct -After making these changes, you should be able to render the RFIle for the branch `update-ethernet1` and see the changes made to the data AND to the schema at the same time at the address -[`/rfile/device_startup?device=ord1-edge1&branch=update-ethernet1`](http://localhost:8000/api/rfile/device_startup?device=ord1-edge1&branch=update-ethernet1) +After making these changes, you should be able to render the RFile for the branch `update-ethernet1` and see the changes made to the data AND to the schema at the same time at the address +[`/rfile/device_startup?device=atl1-edge1&branch=update-ethernet1`](http://localhost:8000/api/rfile/device_startup?device=atl1-edge1&branch=update-ethernet1) !!! #### 4. Merge the Branch `update-ethernet1` -After merging the branch `update-ethernet1`, regenerate the configuration for `ord1-edge1` in `main` and validate that the 2 changes are now available in `main` +After merging the branch `update-ethernet1`, regenerate the configuration for `atl1-edge1` in `main` and validate that the 2 changes are now available in `main` diff --git a/docs/tutorials/getting-started/lineage-information.md b/docs/tutorials/getting-started/lineage-information.md index cd30a4cdb5..d7bbf5f070 100644 --- a/docs/tutorials/getting-started/lineage-information.md +++ b/docs/tutorials/getting-started/lineage-information.md @@ -22,8 +22,8 @@ Currently the list of metadata available is fixed but in the future it will be p The demo dataset that we loaded at the previous step, already has a lot of metadata defined for you to explore. If you navigate to the detailed page of any device you'll be able to see that: 1. The **name** has been defined by the `pop-builder` script and `is_protected` to prevent any further changes -2. 
The **role** has been defined by the `pop-builder`, is owner by the `Engineering Team` and `is_protected` as well -3. The **description** is neither protected nor does it has a source or a owner defined +2. The **role** has been defined by the `pop-builder`, is owned by the `Engineering Team` and `is_protected` as well +3. The **description** is not protected and does not have a source or an owner defined ![](../../media/tutorial/tutorial-4-data.cy.ts/tutorial_4_metadata.png) From 81a8f8207f9593947afa18b6775dd187a0e3f5bf Mon Sep 17 00:00:00 2001 From: pa-lem Date: Thu, 16 Nov 2023 18:03:42 +0100 Subject: [PATCH 058/446] fix delete relationship many --- .../object-item-details/relationship-details-paginated.tsx | 5 +++-- .../object-item-details/relationships-details-paginated.tsx | 4 ++-- 2 files changed, 5 insertions(+), 4 deletions(-) diff --git a/frontend/src/screens/object-item-details/relationship-details-paginated.tsx b/frontend/src/screens/object-item-details/relationship-details-paginated.tsx index 0cc6228587..33ed522f0c 100644 --- a/frontend/src/screens/object-item-details/relationship-details-paginated.tsx +++ b/frontend/src/screens/object-item-details/relationship-details-paginated.tsx @@ -134,7 +134,7 @@ export default function RelationshipDetails(props: iRelationDetailsProps) { const handleDeleteRelationship = async (id: string) => { if (onDeleteRelationship) { - await onDeleteRelationship(id); + await onDeleteRelationship(relationshipSchema.name, id); setShowAddDrawer(false); @@ -148,7 +148,7 @@ export default function RelationshipDetails(props: iRelationDetailsProps) { .filter((item: any) => item.id !== id); const mutationString = updateObjectWithId({ - name: schema.name, + kind: schema.kind, data: stringifyWithoutQuotes({ id: objectid, [relationshipSchema.name]: newList, @@ -615,6 +615,7 @@ export default function RelationshipDetails(props: iRelationDetailsProps) { }} /> + {relatedRowToDelete && ( { + const handleDeleteRelationship = async (name: 
string, id: string) => { const mutationString = removeRelationship({ data: stringifyWithoutQuotes({ id: objectid, - name: "members", + name, nodes: [ { id, From 946586ce024bdea76a4e346b859d2072828c786e Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 20:34:26 +0100 Subject: [PATCH 059/446] Move back end2end test to previous CI stage, as it's taking so long to run the entire pipeline now. --- .github/workflows/ci.yml | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 6c53a110d1..a26477a840 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -388,7 +388,11 @@ jobs: # run: invoke demo.status E2E-testing-neo4j: - needs: ["frontend-tests", "backend-tests-default", "python-sdk-tests"] + needs: + - javascript-lint + - files-changed + - yaml-lint + - python-lint if: | always() && !cancelled() && !contains(needs.*.result, 'failure') && From effea47d8b75058c76a3c7cdc357ced0a9400c00 Mon Sep 17 00:00:00 2001 From: Aaron McCarty Date: Thu, 16 Nov 2023 11:53:33 -0800 Subject: [PATCH 060/446] improve diff data query validation (#1412) * improve validation for getting diff data, and timestamp in general * add unit tests for timestamp and Diff validation errors * use new query validation model, remove exc handling blocks from API code * comment to ignore self argument error * mypy fixes, small refactoring --- backend/infrahub/api/diff/__init__.py | 3 + backend/infrahub/api/{ => diff}/diff.py | 16 +++- .../infrahub/api/diff/validation_models.py | 37 +++++++++ backend/infrahub/api/exception_handlers.py | 22 ++++++ backend/infrahub/api/exceptions.py | 8 ++ backend/infrahub/core/branch.py | 13 ++- backend/infrahub/exceptions.py | 22 ++++++ backend/infrahub/server.py | 29 ++++--- backend/tests/unit/api/diff/__init__.py | 0 .../api/diff/test_diff_query_validation.py | 42 ++++++++++ backend/tests/unit/api/test_15_diff.py | 2 +-
.../unit/api/test_api_exception_handler.py | 79 +++++++++++++++++++ backend/tests/unit/core/test_diff_init.py | 46 +++++++++++ python_sdk/infrahub_sdk/timestamp.py | 8 +- python_sdk/tests/unit/sdk/test_timestamp.py | 8 +- 15 files changed, 315 insertions(+), 20 deletions(-) create mode 100644 backend/infrahub/api/diff/__init__.py rename backend/infrahub/api/{ => diff}/diff.py (98%) create mode 100644 backend/infrahub/api/diff/validation_models.py create mode 100644 backend/infrahub/api/exception_handlers.py create mode 100644 backend/infrahub/api/exceptions.py create mode 100644 backend/tests/unit/api/diff/__init__.py create mode 100644 backend/tests/unit/api/diff/test_diff_query_validation.py create mode 100644 backend/tests/unit/api/test_api_exception_handler.py create mode 100644 backend/tests/unit/core/test_diff_init.py diff --git a/backend/infrahub/api/diff/__init__.py b/backend/infrahub/api/diff/__init__.py new file mode 100644 index 0000000000..25b37e6879 --- /dev/null +++ b/backend/infrahub/api/diff/__init__.py @@ -0,0 +1,3 @@ +from .diff import router + +__all__ = ["router"] diff --git a/backend/infrahub/api/diff.py b/backend/infrahub/api/diff/diff.py similarity index 98% rename from backend/infrahub/api/diff.py rename to backend/infrahub/api/diff/diff.py index f1702812dd..2bef4825a2 100644 --- a/backend/infrahub/api/diff.py +++ b/backend/infrahub/api/diff/diff.py @@ -28,6 +28,8 @@ from infrahub.core.schema_manager import INTERNAL_SCHEMA_NODE_KINDS from infrahub.database import InfrahubDatabase # noqa: TCH001 +from .validation_models import DiffQueryValidated + if TYPE_CHECKING: from infrahub.message_bus.rpc import InfrahubRpcClient @@ -931,8 +933,13 @@ async def get_diff_data( branch_only: bool = True, _: str = Depends(get_current_user), ) -> BranchDiff: + query = DiffQueryValidated(branch=branch, time_from=time_from, time_to=time_to, branch_only=branch_only) diff = await branch.diff( - db=db, diff_from=time_from, diff_to=time_to, branch_only=branch_only, 
namespaces_exclude=["Schema"] + db=db, + diff_from=query.time_from, + diff_to=query.time_to, + branch_only=query.branch_only, + namespaces_exclude=["Schema"], ) schema = registry.schema.get_full(branch=branch) diff_payload = DiffPayload(db=db, diff=diff, kinds_to_include=list(schema.keys())) @@ -948,8 +955,13 @@ async def get_diff_schema( branch_only: bool = True, _: str = Depends(get_current_user), ) -> BranchDiff: + query = DiffQueryValidated(branch=branch, time_from=time_from, time_to=time_to, branch_only=branch_only) diff = await branch.diff( - db=db, diff_from=time_from, diff_to=time_to, branch_only=branch_only, kinds_include=INTERNAL_SCHEMA_NODE_KINDS + db=db, + diff_from=query.time_from, + diff_to=query.time_to, + branch_only=query.branch_only, + kinds_include=INTERNAL_SCHEMA_NODE_KINDS, ) diff_payload = DiffPayload(db=db, diff=diff) return await diff_payload.generate_diff_payload() diff --git a/backend/infrahub/api/diff/validation_models.py b/backend/infrahub/api/diff/validation_models.py new file mode 100644 index 0000000000..d75fc0457e --- /dev/null +++ b/backend/infrahub/api/diff/validation_models.py @@ -0,0 +1,37 @@ +from typing import Any, Dict, Optional + +from pydantic import BaseModel, root_validator, validator + +from infrahub.core.branch import Branch +from infrahub.core.timestamp import Timestamp + + +class DiffQueryValidated(BaseModel): + branch: Branch + time_from: Optional[str] + time_to: Optional[str] + branch_only: bool + + class Config: + arbitrary_types_allowed = True + + @validator("time_from", "time_to", pre=True) + @classmethod + def validate_time(cls, value: Optional[str]) -> Optional[str]: + if not value: + return None + Timestamp(value) + return value + + @root_validator(skip_on_failure=True) + @classmethod + def validate_time_from_if_required(cls, values: Dict[str, Any]) -> Dict[str, Any]: + branch: Optional[Branch] = values.get("branch") + time_from: Optional[Timestamp] = values.get("time_from") + if getattr(branch, "is_default", 
False) and not time_from: + branch_name = getattr(branch, "name", "") + raise ValueError(f"time_from is mandatory when diffing on the default branch `{branch_name}`.") + time_to: Optional[Timestamp] = values.get("time_to") + if time_to and time_from and time_to < time_from: + raise ValueError("time_from and time_to are not a valid time range") + return values diff --git a/backend/infrahub/api/exception_handlers.py b/backend/infrahub/api/exception_handlers.py new file mode 100644 index 0000000000..3ae3579eea --- /dev/null +++ b/backend/infrahub/api/exception_handlers.py @@ -0,0 +1,22 @@ +from fastapi.responses import JSONResponse +from pydantic import ValidationError + +from infrahub.exceptions import Error + + +async def generic_api_exception_handler(_, exc: Exception, http_code: int = 500) -> JSONResponse: + """Generic API Exception handler.""" + if isinstance(exc, Error): + if exc.HTTP_CODE: + http_code = exc.HTTP_CODE + messages = [str(exc.message) if exc.message else exc.DESCRIPTION] + elif isinstance(exc, ValidationError): + messages = [ed["msg"] for ed in exc.errors()] + else: + messages = [str(exc)] + error_dict = { + "data": None, + "errors": [{"message": message, "extensions": {"code": http_code}} for message in messages], + } + + return JSONResponse(status_code=http_code, content=error_dict) diff --git a/backend/infrahub/api/exceptions.py b/backend/infrahub/api/exceptions.py new file mode 100644 index 0000000000..3a77b2161e --- /dev/null +++ b/backend/infrahub/api/exceptions.py @@ -0,0 +1,8 @@ +from infrahub.exceptions import Error + + +class QueryValidationError(Error): + HTTP_CODE = 400 + + def __init__(self, message: str): + self.message = message diff --git a/backend/infrahub/core/branch.py b/backend/infrahub/core/branch.py index c212c809a4..7612d25b9d 100644 --- a/backend/infrahub/core/branch.py +++ b/backend/infrahub/core/branch.py @@ -39,7 +39,12 @@ from infrahub.core.registry import get_branch, registry from infrahub.core.timestamp import 
Timestamp from infrahub.core.utils import add_relationship, update_relationships_to -from infrahub.exceptions import BranchNotFound, ValidationError +from infrahub.exceptions import ( + BranchNotFound, + DiffFromRequiredOnDefaultBranchError, + DiffRangeValidationError, + ValidationError, +) from infrahub.message_bus import messages from infrahub.message_bus.responses import DiffNamesResponse @@ -1035,7 +1040,9 @@ def __init__( self.branch_support = branch_support or [BranchSupportType.AWARE] if not diff_from and self.branch.is_default: - raise ValueError(f"diff_from is mandatory when diffing on the default branch `{self.branch.name}`.") + raise DiffFromRequiredOnDefaultBranchError( + f"diff_from is mandatory when diffing on the default branch `{self.branch.name}`." + ) # If diff from hasn't been provided, we'll use the creation of the branch as the starting point if diff_from: @@ -1047,7 +1054,7 @@ def __init__( self.diff_to = Timestamp(diff_to) if self.diff_to < self.diff_from: - raise ValueError("diff_to must be later than diff_from") + raise DiffRangeValidationError("diff_to must be later than diff_from") # Results organized by Branch self._results: Dict[str, dict] = defaultdict( diff --git a/backend/infrahub/exceptions.py b/backend/infrahub/exceptions.py index 4d90ccf4a2..d1614e255d 100644 --- a/backend/infrahub/exceptions.py +++ b/backend/infrahub/exceptions.py @@ -185,6 +185,13 @@ def __str__(self): """ +class QueryValidationError(Error): + HTTP_CODE = 400 + + def __init__(self, message: str): + self.message = message + + class ValidationError(Error): def __init__(self, input_value): self.message: Optional[str] = None @@ -215,3 +222,18 @@ def __str__(self): return ", ".join([f"{message} at {location}" for location, message in self.messages.items()]) return f"{self.message} at {self.location or ''}" + + +class DiffError(Error): + HTTP_CODE = 400 + + def __init__(self, message: str): + self.message = message + + +class DiffRangeValidationError(DiffError): + ... 
+ + +class DiffFromRequiredOnDefaultBranchError(DiffError): + ... diff --git a/backend/infrahub/server.py b/backend/infrahub/server.py index 0f7c7ba989..6938aaf0b6 100644 --- a/backend/infrahub/server.py +++ b/backend/infrahub/server.py @@ -2,22 +2,25 @@ import logging import os import time +from functools import partial from typing import Awaitable, Callable from asgi_correlation_id import CorrelationIdMiddleware from asgi_correlation_id.context import correlation_id from fastapi import FastAPI, Request, Response from fastapi.logger import logger -from fastapi.responses import JSONResponse from fastapi.staticfiles import StaticFiles from fastapi.templating import Jinja2Templates +from infrahub_sdk.timestamp import TimestampFormatError from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor +from pydantic import ValidationError from starlette_exporter import PrometheusMiddleware, handle_metrics import infrahub.config as config from infrahub import __version__ from infrahub.api import router as api from infrahub.api.background import BackgroundRunner +from infrahub.api.exception_handlers import generic_api_exception_handler from infrahub.core.initialization import initialization from infrahub.database import InfrahubDatabase, InfrahubDatabaseMode, get_db from infrahub.exceptions import Error @@ -129,18 +132,18 @@ async def add_process_time_header(request: Request, call_next): return response -app.add_middleware(CorrelationIdMiddleware) - - -@app.exception_handler(Error) -async def api_exception_handler_base_infrahub_error(_: Request, exc: Error) -> JSONResponse: - """Generic API Exception handler.""" - - error = exc.api_response() - add_span_exception(exc) - return JSONResponse(status_code=exc.HTTP_CODE, content=error) +@app.middleware("http") +async def add_telemetry_span_exception( + request: Request, call_next: Callable[[Request], Awaitable[Response]] +) -> Response: + try: + return await call_next(request) + except Exception as exc: + 
add_span_exception(exc) + raise +app.add_middleware(CorrelationIdMiddleware) app.add_middleware( PrometheusMiddleware, app_name="infrahub", @@ -151,6 +154,10 @@ async def api_exception_handler_base_infrahub_error(_: Request, exc: Error) -> J ) app.add_middleware(InfrahubCORSMiddleware) +app.add_exception_handler(Error, generic_api_exception_handler) +app.add_exception_handler(TimestampFormatError, partial(generic_api_exception_handler, http_code=400)) +app.add_exception_handler(ValidationError, partial(generic_api_exception_handler, http_code=400)) + app.add_route(path="/metrics", route=handle_metrics) app.add_route(path="/graphql", route=InfrahubGraphQLApp(playground=True), methods=["GET", "POST", "OPTIONS"]) app.add_route( diff --git a/backend/tests/unit/api/diff/__init__.py b/backend/tests/unit/api/diff/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/backend/tests/unit/api/diff/test_diff_query_validation.py b/backend/tests/unit/api/diff/test_diff_query_validation.py new file mode 100644 index 0000000000..14c44c4428 --- /dev/null +++ b/backend/tests/unit/api/diff/test_diff_query_validation.py @@ -0,0 +1,42 @@ +import pytest +from pydantic import ValidationError + +from infrahub.api.diff.validation_models import DiffQueryValidated +from infrahub.core.branch import Branch + + +class TestDiffQueryValidation: + def setup_method(self): + self.branch = Branch(name="abc") + self.time_start_str = "2023-06-11" + self.time_end_str = "2023-06-13" + + def test_valid_query(self): + query = DiffQueryValidated( + branch=self.branch, time_from=self.time_start_str, time_to=self.time_end_str, branch_only=True + ) + + assert query.branch == self.branch + assert query.time_from == self.time_start_str + assert query.time_to == self.time_end_str + assert query.branch_only is True + + def test_invalid_time_from(self): + with pytest.raises(ValidationError): + DiffQueryValidated(branch=self.branch, time_from="notatime") + + def test_invalid_time_to(self): + 
with pytest.raises(ValidationError): + DiffQueryValidated(branch=self.branch, time_to="notatime") + + def test_invalid_time_range(self): + with pytest.raises(ValidationError, match="time_from and time_to are not a valid time range"): + DiffQueryValidated( + branch=self.branch, time_from=self.time_end_str, time_to=self.time_start_str, branch_only=True + ) + + def test_time_from_required_for_default_branch(self): + self.branch.is_default = True + + with pytest.raises(ValidationError, match="time_from is mandatory when diffing on the default branch `abc`."): + DiffQueryValidated(branch=self.branch, branch_only=True) diff --git a/backend/tests/unit/api/test_15_diff.py b/backend/tests/unit/api/test_15_diff.py index 0a9604797f..32c0af60bb 100644 --- a/backend/tests/unit/api/test_15_diff.py +++ b/backend/tests/unit/api/test_15_diff.py @@ -1,7 +1,7 @@ import pytest from deepdiff import DeepDiff -from infrahub.api.diff import get_display_labels, get_display_labels_per_kind +from infrahub.api.diff.diff import get_display_labels, get_display_labels_per_kind from infrahub.core.initialization import create_branch from infrahub.core.manager import NodeManager from infrahub.core.node import Node diff --git a/backend/tests/unit/api/test_api_exception_handler.py b/backend/tests/unit/api/test_api_exception_handler.py new file mode 100644 index 0000000000..bdb366dd83 --- /dev/null +++ b/backend/tests/unit/api/test_api_exception_handler.py @@ -0,0 +1,79 @@ +from json import loads +from typing import Optional + +from pydantic import BaseModel, ValidationError, root_validator, validator + +from infrahub.api.exception_handlers import generic_api_exception_handler +from infrahub.exceptions import Error + + +class ModelForTesting(BaseModel): + field_1: Optional[str] + + @validator("field_1", always=True) + def always_fail(cls, *args, **kwargs): + raise ValueError("this is the error message") + + @root_validator() + def always_fail_2(cls, values): + raise ValueError("another error message") 
+ + +class MockError(Error): + HTTP_CODE = 418 + DESCRIPTION = "the teapot error" + + def __init__(self, message: Optional[str]): + self.message = message + + +class TestAPIExceptionHandler: + def setup_method(self): + self.error_message = "this is the error message" + + async def test_plain_exception_error(self): + exception = ValueError(self.error_message) + + error_response = await generic_api_exception_handler(None, exception) + + error_dict = loads(error_response.body.decode()) + assert error_dict["errors"] == [{"message": self.error_message, "extensions": {"code": 500}}] + + async def test_pydantic_validation_error(self): + error_message_2 = "another error message" + exception = None + try: + ModelForTesting(field_1="abc") + except ValidationError as exc: + exception = exc + + error_response = await generic_api_exception_handler(None, exception, http_code=400) + + error_dict = loads(error_response.body.decode()) + assert {"message": self.error_message, "extensions": {"code": 400}} in error_dict["errors"] + assert {"message": error_message_2, "extensions": {"code": 400}} in error_dict["errors"] + assert len(error_dict) == 2 + + async def test_infrahub_api_error(self): + exception = MockError(self.error_message) + + error_response = await generic_api_exception_handler(None, exception) + + error_dict = loads(error_response.body.decode()) + assert error_dict["errors"] == [{"message": self.error_message, "extensions": {"code": 418}}] + + async def test_infrahub_api_error_default_message(self): + exception = MockError(None) + + error_response = await generic_api_exception_handler(None, exception) + + error_dict = loads(error_response.body.decode()) + assert error_dict["errors"] == [{"message": "the teapot error", "extensions": {"code": 418}}] + + async def test_infrahub_api_error_code_override(self): + exception = MockError(None) + + error_response = await generic_api_exception_handler(None, exception, http_code=500) + + error_dict = 
loads(error_response.body.decode()) + assert error_dict["errors"] == [{"message": "the teapot error", "extensions": {"code": 418}}] diff --git a/backend/tests/unit/core/test_diff_init.py b/backend/tests/unit/core/test_diff_init.py new file mode 100644 index 0000000000..c1e61ec2fc --- /dev/null +++ b/backend/tests/unit/core/test_diff_init.py @@ -0,0 +1,46 @@ +from unittest.mock import AsyncMock, MagicMock + +import pytest + +from infrahub.core.branch import Branch, Diff +from infrahub.core.timestamp import Timestamp +from infrahub.exceptions import DiffFromRequiredOnDefaultBranchError, DiffRangeValidationError + + +class TestDiffInit: + def setup_method(self): + self.db = MagicMock() + self.origin_branch = Branch(name="origin") + self.created_at_str = "2023-11-01" + self.created_at_timestamp = Timestamp(self.created_at_str) + self.branch = AsyncMock(spec=Branch) + self.branch.name = "branch" + self.branch.is_default = False + self.branch.created_at = self.created_at_str + self.branch.get_origin_branch.return_value = self.origin_branch + + async def __call_system_under_test(self, branch, **kwargs): + return await Diff.init(self.db, branch, **kwargs) + + async def test_diff_from_required_for_default_branch(self): + self.branch.is_default = True + + with pytest.raises(DiffFromRequiredOnDefaultBranchError): + await self.__call_system_under_test(self.branch) + + async def test_diff_to_cannot_precede_diff_from(self): + bad_diff_to = "2023-10-31" + + with pytest.raises(DiffRangeValidationError): + await self.__call_system_under_test(self.branch, diff_to=bad_diff_to) + + async def test_diff_from_default_is_set(self): + diff_to_str = "2023-11-15" + + diff = await self.__call_system_under_test(self.branch, diff_to=diff_to_str) + + self.branch.get_origin_branch.assert_awaited_once_with(db=self.db) + assert diff.branch == self.branch + assert diff.origin_branch == self.origin_branch + assert diff.diff_from == self.created_at_timestamp + assert diff.diff_to == 
Timestamp(diff_to_str) diff --git a/python_sdk/infrahub_sdk/timestamp.py b/python_sdk/infrahub_sdk/timestamp.py index 1119eb0472..989e3752b5 100644 --- a/python_sdk/infrahub_sdk/timestamp.py +++ b/python_sdk/infrahub_sdk/timestamp.py @@ -13,6 +13,10 @@ } +class TimestampFormatError(ValueError): + ... + + class Timestamp: def __init__(self, value: Optional[Union[str, DateTime, Timestamp]] = None): if value and isinstance(value, DateTime): @@ -30,7 +34,7 @@ def _parse_string(cls, value: str) -> DateTime: parsed_date = pendulum.parse(value) if isinstance(parsed_date, DateTime): return parsed_date - except pendulum.parsing.exceptions.ParserError: + except (pendulum.parsing.exceptions.ParserError, ValueError): pass params = {} @@ -40,7 +44,7 @@ def _parse_string(cls, value: str) -> DateTime: params[key] = int(match.group(1)) if not params: - raise ValueError(f"Invalid time format for {value}") + raise TimestampFormatError(f"Invalid time format for {value}") return DateTime.now(tz="UTC").subtract(**params) diff --git a/python_sdk/tests/unit/sdk/test_timestamp.py b/python_sdk/tests/unit/sdk/test_timestamp.py index 12c7aa0627..bcdf18a776 100644 --- a/python_sdk/tests/unit/sdk/test_timestamp.py +++ b/python_sdk/tests/unit/sdk/test_timestamp.py @@ -1,7 +1,7 @@ import pendulum import pytest -from infrahub_sdk.timestamp import Timestamp +from infrahub_sdk.timestamp import Timestamp, TimestampFormatError def test_init_empty(): @@ -49,3 +49,9 @@ def test_compare(): assert t11 <= t12 assert t11 >= t12 assert t11 == t12 + + +@pytest.mark.parametrize("invalid_str", ["blurple", "1122334455667788", "2023-45-99"]) +def test_invalid_raises_correct_error(invalid_str): + with pytest.raises(TimestampFormatError): + Timestamp(invalid_str) From bf9ec5a55c20c0000c8e2c86b179c0a56bf14878 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 21:13:09 +0100 Subject: [PATCH 061/446] Changes to SDK recorder --- python_sdk/infrahub_sdk/client.py | 32 +++++++++++++++++++++++------ 
python_sdk/infrahub_sdk/config.py | 10 +++++---- python_sdk/infrahub_sdk/recorder.py | 14 ++++++++++++- 3 files changed, 45 insertions(+), 11 deletions(-) diff --git a/python_sdk/infrahub_sdk/client.py b/python_sdk/infrahub_sdk/client.py index 6aae7d8fa7..8fd0a1089a 100644 --- a/python_sdk/infrahub_sdk/client.py +++ b/python_sdk/infrahub_sdk/client.py @@ -84,8 +84,6 @@ def _initialize(self) -> None: """Sets the properties for each version of the client""" def _record(self, response: httpx.Response) -> None: - if not self.config.custom_recorder: - return self.config.custom_recorder.record(response) @@ -98,7 +96,7 @@ def _initialize(self) -> None: self.object_store = ObjectStore(self) self.store = NodeStore() self.concurrent_execution_limit = asyncio.Semaphore(self.max_concurrent_execution) - self._request: AsyncRequester = self.config.requester or self._default_request_method + self._request_method: AsyncRequester = self.config.requester or self._default_request_method @classmethod async def init(cls, *args: Any, **kwargs: Any) -> InfrahubClient: @@ -420,6 +418,18 @@ async def _get(self, url: str, headers: Optional[dict] = None, timeout: Optional timeout=timeout or self.default_timeout, ) + async def _request( + self, + url: str, + method: HTTPMethod, + headers: Dict[str, Any], + timeout: int, + payload: Optional[Dict] = None, + ) -> httpx.Response: + response = await self._request_method(url=url, method=method, headers=headers, timeout=timeout, payload=payload) + self._record(response) + return response + async def _default_request_method( self, url: str, @@ -445,7 +455,6 @@ async def _default_request_method( except httpx.ReadTimeout as exc: raise ServerNotResponsiveError(url=url, timeout=timeout) from exc - self._record(response) return response async def login(self, refresh: bool = False) -> None: @@ -568,7 +577,7 @@ def _initialize(self) -> None: self.branch = InfrahubBranchManagerSync(self) self.object_store = ObjectStoreSync(self) self.store = NodeStoreSync() 
- self._request: SyncRequester = self.config.sync_requester or self._default_request_method + self._request_method: SyncRequester = self.config.sync_requester or self._default_request_method @classmethod def init(cls, *args: Any, **kwargs: Any) -> InfrahubClientSync: @@ -919,6 +928,18 @@ def _post( timeout=timeout or self.default_timeout, ) + def _request( + self, + url: str, + method: HTTPMethod, + headers: Dict[str, Any], + timeout: int, + payload: Optional[Dict] = None, + ) -> httpx.Response: + response = self._request_method(url=url, method=method, headers=headers, timeout=timeout, payload=payload) + self._record(response) + return response + def _default_request_method( self, url: str, @@ -944,7 +965,6 @@ def _default_request_method( except httpx.ReadTimeout as exc: raise ServerNotResponsiveError(url=url, timeout=timeout) from exc - self._record(response) return response def login(self, refresh: bool = False) -> None: diff --git a/python_sdk/infrahub_sdk/config.py b/python_sdk/infrahub_sdk/config.py index ddcc319956..7ca9b1985e 100644 --- a/python_sdk/infrahub_sdk/config.py +++ b/python_sdk/infrahub_sdk/config.py @@ -3,7 +3,7 @@ from pydantic import BaseSettings, Field, root_validator, validator from infrahub_sdk.playback import JSONPlayback -from infrahub_sdk.recorder import JSONRecorder, Recorder, RecorderType +from infrahub_sdk.recorder import JSONRecorder, NoRecorder, Recorder, RecorderType from infrahub_sdk.types import AsyncRequester, RequesterTransport, SyncRequester from infrahub_sdk.utils import is_valid_url @@ -20,8 +20,8 @@ class Config(BaseSettings): default=RecorderType.NONE, description="Select builtin recorder for later replay.", ) - custom_recorder: Optional[Recorder] = Field( - default=None, + custom_recorder: Recorder = Field( + default_factory=NoRecorder.default, description="Provides a way to record responses from the Infrahub API", ) requester: Optional[AsyncRequester] = None @@ -49,7 +49,9 @@ def validate_credentials_input(cls, values: 
Dict[str, Any]) -> Dict[str, Any]: @root_validator(pre=True) @classmethod def set_custom_recorder(cls, values: Dict[str, Any]) -> Dict[str, Any]: - if values.get("recorder") == RecorderType.JSON and "custom_recorder" not in values: + if values.get("recorder") == RecorderType.NONE and "custom_recorder" not in values: + values["custom_recorder"] = NoRecorder() + elif values.get("recorder") == RecorderType.JSON and "custom_recorder" not in values: values["custom_recorder"] = JSONRecorder() return values diff --git a/python_sdk/infrahub_sdk/recorder.py b/python_sdk/infrahub_sdk/recorder.py index 393575cd7f..d766c01011 100644 --- a/python_sdk/infrahub_sdk/recorder.py +++ b/python_sdk/infrahub_sdk/recorder.py @@ -1,3 +1,5 @@ +from __future__ import annotations + import enum import json from typing import Protocol, runtime_checkable @@ -16,7 +18,17 @@ class RecorderType(str, enum.Enum): @runtime_checkable class Recorder(Protocol): def record(self, response: httpx.Response) -> None: - ... + """Record the response from Infrahub""" + + +class NoRecorder: + @staticmethod + def record(response: httpx.Response) -> None: + """The NoRecorder just silently returns""" + + @classmethod + def default(cls) -> NoRecorder: + return cls() class JSONRecorder(BaseSettings): From d7bb8360e6e223bfcea3ca7afc5c6ac260b29bb4 Mon Sep 17 00:00:00 2001 From: Mark Michon Date: Thu, 16 Nov 2023 17:02:41 -0800 Subject: [PATCH 062/446] docs(fix): update schema template doc paths --- backend/infrahub/cli/doc.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/backend/infrahub/cli/doc.py b/backend/infrahub/cli/doc.py index cf2207f876..ae6f6e8e28 100644 --- a/backend/infrahub/cli/doc.py +++ b/backend/infrahub/cli/doc.py @@ -20,8 +20,8 @@ def generate_schema() -> None: here = os.path.abspath(os.path.dirname(__file__)) for schema_name in schemas_to_generate: - template_file = os.path.join(here, f"{DOCUMENTATION_DIRECTORY}/schema/{schema_name}.j2") - output_file = os.path.join(here, 
f"{DOCUMENTATION_DIRECTORY}/schema/{schema_name}.md") + template_file = os.path.join(here, f"{DOCUMENTATION_DIRECTORY}/reference/schema/{schema_name}.j2") + output_file = os.path.join(here, f"{DOCUMENTATION_DIRECTORY}/reference/schema/{schema_name}.md") if not os.path.exists(template_file): print(f"Unable to find the template file at {template_file}") raise typer.Exit(1) From 3a346255727c2e8eb9cb172d1431a54a6e6f5689 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 03:54:02 +0100 Subject: [PATCH 063/446] Fix CI file --- .github/workflows/ci.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 79c1730f80..2ee4cc817a 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -399,7 +399,7 @@ jobs: # run: invoke demo.status E2E-testing-neo4j: - needs: ["frontend-tests", "backend-tests-default", "python-sdk-tests"] + needs: ["frontend-tests", "backend-tests-default", "python-sdk-integration-tests"] if: | always() && !cancelled() && !contains(needs.*.result, 'failure') && From fb9a717d599b6e9c1d3409a43be9a6c620fb4606 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 04:08:59 +0100 Subject: [PATCH 064/446] Fix cmd to set pydantic version --- .github/workflows/ci.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 2ee4cc817a..dfce9d98a9 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -138,7 +138,7 @@ jobs: pipx install poetry pip install invoke toml - name: Set Version of Pydantic - run: poetry install pydantic@${{ matrix.pydantic-version }} + run: poetry add pydantic@${{ matrix.pydantic-version }} working-directory: python_sdk/ - name: "Build Test Image" run: "invoke test.build" From 350ada2b168d2f451861ec812b23754126be348f Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 04:55:27 +0100 Subject: [PATCH 065/446] ignore[attr-defined]
for pydantic --- python_sdk/infrahub_ctl/config.py | 4 ++-- python_sdk/infrahub_sdk/branch.py | 2 +- python_sdk/infrahub_sdk/schema.py | 2 +- 3 files changed, 4 insertions(+), 4 deletions(-) diff --git a/python_sdk/infrahub_ctl/config.py b/python_sdk/infrahub_ctl/config.py index 035d68f124..3691d43f5b 100644 --- a/python_sdk/infrahub_ctl/config.py +++ b/python_sdk/infrahub_ctl/config.py @@ -6,9 +6,9 @@ import typer try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: - import pydantic + import pydantic # type: ignore[no-redef] DEFAULT_CONFIG_FILE = "infrahubctl.toml" ENVVAR_CONFIG_FILE = "INFRAHUBCTL_CONFIG" diff --git a/python_sdk/infrahub_sdk/branch.py b/python_sdk/infrahub_sdk/branch.py index 2596807b15..3ee5dcb5ea 100644 --- a/python_sdk/infrahub_sdk/branch.py +++ b/python_sdk/infrahub_sdk/branch.py @@ -3,7 +3,7 @@ from typing import TYPE_CHECKING, Any, Dict, Optional, Union try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: import pydantic # type: ignore[no-redef] diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index b21871eb77..30b1b6a7c2 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -15,7 +15,7 @@ ) try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: import pydantic # type: ignore[no-redef] From 23d8d883c41f2c045189e6b6fc55da863fb24b23 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 05:04:34 +0100 Subject: [PATCH 066/446] ignore[attr-defined] for pydantic cont.
--- python_sdk/infrahub_ctl/cli.py | 4 ++-- python_sdk/infrahub_sdk/config.py | 2 +- python_sdk/infrahub_sdk/playback.py | 2 +- python_sdk/infrahub_sdk/recorder.py | 2 +- 4 files changed, 5 insertions(+), 5 deletions(-) diff --git a/python_sdk/infrahub_ctl/cli.py b/python_sdk/infrahub_ctl/cli.py index 40759eb635..242d69d518 100644 --- a/python_sdk/infrahub_ctl/cli.py +++ b/python_sdk/infrahub_ctl/cli.py @@ -11,9 +11,9 @@ import typer try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: - import pydantic + import pydantic # type: ignore[no-redef] from rich.console import Console from rich.logging import RichHandler diff --git a/python_sdk/infrahub_sdk/config.py b/python_sdk/infrahub_sdk/config.py index 1703923203..864d9ee604 100644 --- a/python_sdk/infrahub_sdk/config.py +++ b/python_sdk/infrahub_sdk/config.py @@ -1,7 +1,7 @@ from typing import Any, Dict, Optional try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: import pydantic # type: ignore[no-redef] diff --git a/python_sdk/infrahub_sdk/playback.py b/python_sdk/infrahub_sdk/playback.py index 5176ba2fd7..96fcf51a9d 100644 --- a/python_sdk/infrahub_sdk/playback.py +++ b/python_sdk/infrahub_sdk/playback.py @@ -4,7 +4,7 @@ import httpx try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: import pydantic # type: ignore[no-redef] diff --git a/python_sdk/infrahub_sdk/recorder.py b/python_sdk/infrahub_sdk/recorder.py index 89d225fad8..6af6eacc9c 100644 --- a/python_sdk/infrahub_sdk/recorder.py +++ b/python_sdk/infrahub_sdk/recorder.py @@ -5,7 +5,7 @@ import httpx try: - from pydantic import v1 as pydantic + from pydantic import v1 as pydantic # type: ignore[attr-defined] except ImportError: import pydantic # type: ignore[no-redef] From a3c3af8ef7c23a5f31ac421ad86796e317252289 Mon Sep 17 
00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 05:17:57 +0100 Subject: [PATCH 067/446] Execute unit tests outside of docker --- .github/workflows/ci.yml | 12 +++++------- 1 file changed, 5 insertions(+), 7 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index dfce9d98a9..ebee22a38c 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -140,16 +140,14 @@ jobs: - name: Set Version of Pydantic run: poetry add pydantic@${{ matrix.pydantic-version }} working-directory: python_sdk/ - - name: "Build Test Image" - run: "invoke test.build" - - name: "Pull External Docker Images" - run: "invoke test.pull" + - name: "Install Package" + run: "poetry install" - name: "Pylint Tests" - run: "invoke sdk.pylint --docker" + run: "invoke sdk.pylint" - name: "Mypy Tests" - run: "invoke sdk.mypy --docker" + run: "invoke sdk.mypy" - name: "Unit Tests" - run: "invoke sdk.test-unit" + run: "poetry run pytest -v --cov=infrahub_sdk tests/unit" env: BUILDKITE_ANALYTICS_TOKEN: ${{ secrets.BUILDKITE_SDK_UNIT }} - name: "Coveralls : Unit Tests" From 4270426f1284a8b6f8a61f3e9dd8d7efee50dba0 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 05:21:18 +0100 Subject: [PATCH 068/446] Execute pylint and mypy without invoke --- .github/workflows/ci.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index ebee22a38c..2944b8945a 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -143,9 +143,9 @@ jobs: - name: "Install Package" run: "poetry install" - name: "Pylint Tests" - run: "invoke sdk.pylint" + run: "poetry run pylint infrahub_sdk/ infrahub_ctl/" - name: "Mypy Tests" - run: "invoke sdk.mypy" + run: "poetry run mypy --show-error-codes infrahub_sdk/ infrahub_ctl/" - name: "Unit Tests" run: "poetry run pytest -v --cov=infrahub_sdk tests/unit" env: From 8ee2b674201e2fcf91458519bb565bc7bcd8bfaf Mon Sep 17 00:00:00 2001 From: Damien 
Garros Date: Fri, 17 Nov 2023 05:26:24 +0100 Subject: [PATCH 069/446] Set default working directory for sdk tests --- .github/workflows/ci.yml | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 2944b8945a..cf8ccdc707 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -126,6 +126,9 @@ jobs: timeout-minutes: 30 env: INFRAHUB_DB_TYPE: memgraph + defaults: + run: + working-directory: python_sdk/ steps: - name: "Check out repository code" uses: "actions/checkout@v3" @@ -139,7 +142,6 @@ jobs: pip install invoke toml - name: Set Version of Pydantic run: poetry add pydantic@${{ matrix.pydantic-version }} - working-directory: python_sdk/ - name: "Install Package" run: "poetry install" - name: "Pylint Tests" From b1358d434eba0870fcd6da0cda5cb8337cb02735 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 05:35:52 +0100 Subject: [PATCH 070/446] Disable poetry venv for the SDK --- .github/workflows/ci.yml | 1 + 1 file changed, 1 insertion(+) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index cf8ccdc707..20963938de 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -139,6 +139,7 @@ jobs: - name: "Setup environment" run: | pipx install poetry + poetry config virtualenvs.create false pip install invoke toml - name: Set Version of Pydantic run: poetry add pydantic@${{ matrix.pydantic-version }} From 4e1f63ec6bdada847c2788e062b161b9b8dbe86c Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 05:52:11 +0100 Subject: [PATCH 071/446] switch back to venv with active version of python --- .github/workflows/ci.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 20963938de..b353c51b83 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -139,7 +139,7 @@ jobs: - name: "Setup environment" run: | pipx install poetry - poetry config 
virtualenvs.create false + poetry config virtualenvs.prefer-active-python true pip install invoke toml - name: Set Version of Pydantic run: poetry add pydantic@${{ matrix.pydantic-version }} From 2396d3668cdf0cec6009e3ad6c5aa8e0ad987d96 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 06:39:24 +0100 Subject: [PATCH 072/446] Fix missing files for pydantic --- python_sdk/infrahub_ctl/schema.py | 11 ++++++++--- python_sdk/infrahub_sdk/config.py | 5 ----- python_sdk/infrahub_sdk/data.py | 7 +++++-- python_sdk/tests/unit/sdk/test_config.py | 10 +++++++--- 4 files changed, 20 insertions(+), 13 deletions(-) diff --git a/python_sdk/infrahub_ctl/schema.py b/python_sdk/infrahub_ctl/schema.py index f031f79c0b..29c8b1ae9a 100644 --- a/python_sdk/infrahub_ctl/schema.py +++ b/python_sdk/infrahub_ctl/schema.py @@ -5,7 +5,12 @@ import typer import yaml -from pydantic import BaseModel, ValidationError + +try: + from pydantic import v1 as pydantic # type: ignore[attr-defined] +except ImportError: + import pydantic # type: ignore[no-redef] + from rich.console import Console from rich.logging import RichHandler @@ -23,7 +28,7 @@ def callback() -> None: """ -class SchemaFile(BaseModel): +class SchemaFile(pydantic.BaseModel): location: Path content: Optional[dict] = None valid: bool = True @@ -71,7 +76,7 @@ async def _load(schemas: List[Path], branch: str, log: logging.Logger) -> None: for schema_file in schemas_data: try: client.schema.validate(schema_file.content) - except ValidationError as exc: + except pydantic.ValidationError as exc: console.print(f"[red]Schema not valid, found '{len(exc.errors())}' error(s) in {schema_file.location}") has_error = True for error in exc.errors(): diff --git a/python_sdk/infrahub_sdk/config.py b/python_sdk/infrahub_sdk/config.py index 864d9ee604..7238bc9908 100644 --- a/python_sdk/infrahub_sdk/config.py +++ b/python_sdk/infrahub_sdk/config.py @@ -44,7 +44,6 @@ class Config: @pydantic.root_validator(pre=True) @classmethod - 
@classmethod def validate_credentials_input(cls, values: Dict[str, Any]) -> Dict[str, Any]: has_username = "username" in values has_password = "password" in values @@ -54,7 +53,6 @@ def validate_credentials_input(cls, values: Dict[str, Any]) -> Dict[str, Any]: @pydantic.root_validator(pre=True) @classmethod - @classmethod def set_custom_recorder(cls, values: Dict[str, Any]) -> Dict[str, Any]: if values.get("recorder") == RecorderType.JSON and "custom_recorder" not in values: values["custom_recorder"] = JSONRecorder() @@ -62,7 +60,6 @@ def set_custom_recorder(cls, values: Dict[str, Any]) -> Dict[str, Any]: @pydantic.root_validator(pre=True) @classmethod - @classmethod def set_transport(cls, values: Dict[str, Any]) -> Dict[str, Any]: if values.get("transport") == RequesterTransport.JSON: playback = JSONPlayback() @@ -75,7 +72,6 @@ def set_transport(cls, values: Dict[str, Any]) -> Dict[str, Any]: @pydantic.root_validator(pre=True) @classmethod - @classmethod def validate_mix_authentication_schemes(cls, values: Dict[str, Any]) -> Dict[str, Any]: if values.get("password") and values.get("api_token"): raise ValueError("Unable to combine password with token based authentication") @@ -83,7 +79,6 @@ def validate_mix_authentication_schemes(cls, values: Dict[str, Any]) -> Dict[str @pydantic.validator("address") @classmethod - @classmethod def validate_address(cls, value: str) -> str: if is_valid_url(value): return value diff --git a/python_sdk/infrahub_sdk/data.py b/python_sdk/infrahub_sdk/data.py index feead11a50..d36d96a77c 100644 --- a/python_sdk/infrahub_sdk/data.py +++ b/python_sdk/infrahub_sdk/data.py @@ -1,9 +1,12 @@ from typing import Dict -from pydantic import BaseModel +try: + from pydantic import v1 as pydantic # type: ignore[attr-defined] +except ImportError: + import pydantic # type: ignore[no-redef] -class RepositoryData(BaseModel): +class RepositoryData(pydantic.BaseModel): id: str name: str location: str diff --git a/python_sdk/tests/unit/sdk/test_config.py 
b/python_sdk/tests/unit/sdk/test_config.py index 469954e7e8..bb89fc2455 100644 --- a/python_sdk/tests/unit/sdk/test_config.py +++ b/python_sdk/tests/unit/sdk/test_config.py @@ -1,18 +1,22 @@ import pytest -from pydantic.error_wrappers import ValidationError + +try: + from pydantic import v1 as pydantic # type: ignore[attr-defined] +except ImportError: + import pydantic # type: ignore[no-redef] from infrahub_sdk.config import Config def test_combine_authentications(): - with pytest.raises(ValidationError) as exc: + with pytest.raises(pydantic.error_wrappers.ValidationError) as exc: Config(api_token="testing", username="test", password="testpassword") assert "Unable to combine password with token based authentication" in str(exc.value) def test_missing_password(): - with pytest.raises(ValidationError) as exc: + with pytest.raises(pydantic.error_wrappers.ValidationError) as exc: Config(username="test") assert "Both 'username' and 'password' needs to be set" in str(exc.value) From 4c6fd79a5bd8ae0ab52dc98269a192236673e34d Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 09:19:42 +0100 Subject: [PATCH 073/446] Fix infrahubctl schema --- python_sdk/infrahub_ctl/validate.py | 9 +++++++-- 1 file changed, 7 insertions(+), 2 deletions(-) diff --git a/python_sdk/infrahub_ctl/validate.py b/python_sdk/infrahub_ctl/validate.py index 7f6094be5b..4d7d5039e7 100644 --- a/python_sdk/infrahub_ctl/validate.py +++ b/python_sdk/infrahub_ctl/validate.py @@ -5,7 +5,12 @@ import typer import yaml -from pydantic import ValidationError + +try: + from pydantic import v1 as pydantic # type: ignore[attr-defined] +except ImportError: + import pydantic # type: ignore[no-redef] + from rich.console import Console from ujson import JSONDecodeError @@ -38,7 +43,7 @@ async def _schema(schema: Path) -> None: try: client.schema.validate(schema_data) - except ValidationError as exc: + except pydantic.ValidationError as exc: console.print(f"[red]Schema not valid, found 
'{len(exc.errors())}' error(s)") for error in exc.errors(): loc_str = [str(item) for item in error["loc"]] From 4ec3a6b9d4c0afec0fdebcfaafbc57a0c64e60fa Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 09:22:47 +0100 Subject: [PATCH 074/446] Add pytest-httpx back in dev group --- poetry.lock | 75 +++++++++++++++++++++++++++++++------------------- pyproject.toml | 1 + 2 files changed, 48 insertions(+), 28 deletions(-) diff --git a/poetry.lock b/poetry.lock index ac5a1e44bc..fd0672e03a 100644 --- a/poetry.lock +++ b/poetry.lock @@ -721,31 +721,31 @@ pytz = ">2021.1" [[package]] name = "dagit" -version = "1.5.8" +version = "1.5.9" description = "Web UI for dagster." optional = false python-versions = "*" files = [ - {file = "dagit-1.5.8-py3-none-any.whl", hash = "sha256:7fac1f954c7634d8f094587d59514d600a086943258b9ae41fbf8e73e0aae872"}, - {file = "dagit-1.5.8.tar.gz", hash = "sha256:66df7996cb8dbbc8574206d7f269e2c277930e23a10c81d4f37c61f4885bd715"}, + {file = "dagit-1.5.9-py3-none-any.whl", hash = "sha256:a01b8439bf15340449514e612763dd9e16c3afba6990f2d57566a6af23f22d40"}, + {file = "dagit-1.5.9.tar.gz", hash = "sha256:1a584e9bb82596861e1c0f20176f3310421c6e11754c87236080c64b3b60760e"}, ] [package.dependencies] -dagster-webserver = "1.5.8" +dagster-webserver = "1.5.9" [package.extras] -notebook = ["dagster-webserver[notebook] (==1.5.8)"] -test = ["dagster-webserver[test] (==1.5.8)"] +notebook = ["dagster-webserver[notebook] (==1.5.9)"] +test = ["dagster-webserver[test] (==1.5.9)"] [[package]] name = "dagster" -version = "1.5.8" +version = "1.5.9" description = "Dagster is an orchestration platform for the development, production, and observation of data assets." 
optional = false python-versions = "*" files = [ - {file = "dagster-1.5.8-py3-none-any.whl", hash = "sha256:d4aa3d3b76c8add5bdbe16a749c722bd5ef684138892084767610b1b87422b6f"}, - {file = "dagster-1.5.8.tar.gz", hash = "sha256:98ad30e7bc45f6b04a3d0d4672270566d22d401ab43c0d2c647fb78864954ba2"}, + {file = "dagster-1.5.9-py3-none-any.whl", hash = "sha256:3364c2d70bf5b361d5de988ae353420738ea43732013bb62c40cc14d8b414511"}, + {file = "dagster-1.5.9.tar.gz", hash = "sha256:83016ec3e56ad65ee7fd4580c10b833f32eb425533d4c6b94dc00ec063080f83"}, ] [package.dependencies] @@ -753,13 +753,13 @@ alembic = ">=1.2.1,<1.6.3 || >1.6.3,<1.7.0 || >1.7.0,<1.11.0 || >1.11.0" click = ">=5.0" coloredlogs = ">=6.1,<=14.0" croniter = ">=0.3.34" -dagster-pipes = "1.5.8" +dagster-pipes = "1.5.9" docstring-parser = "*" grpcio = ">=1.44.0" grpcio-health-checking = ">=1.44.0" Jinja2 = "*" packaging = ">=20.9" -pendulum = "<3" +pendulum = ">=0.7.0,<3" protobuf = ">=3.20.0" psutil = {version = ">=1.0", markers = "platform_system == \"Windows\""} pydantic = ">1.10.0,<1.10.7 || >1.10.7" @@ -788,17 +788,17 @@ test = ["buildkite-test-collector", "docker", "grpcio-tools (>=1.44.0)", "mock ( [[package]] name = "dagster-graphql" -version = "1.5.8" +version = "1.5.9" description = "The GraphQL frontend to python dagster." 
optional = false python-versions = "*" files = [ - {file = "dagster-graphql-1.5.8.tar.gz", hash = "sha256:04d67fb76db7d75e0d7e839c1191214543c983dcb7085d5e7ba3b9d5573c81f7"}, - {file = "dagster_graphql-1.5.8-py3-none-any.whl", hash = "sha256:408d709e3c2c83e5abbf5416ef1bbde23adbdee93e6938fedd2681a566dd521b"}, + {file = "dagster-graphql-1.5.9.tar.gz", hash = "sha256:2acd14f39eca5f07a202bb56d6c79c049d9160c6917575ccfaab0f5da518092c"}, + {file = "dagster_graphql-1.5.9-py3-none-any.whl", hash = "sha256:46b13bdb7b39403b3bc7843a4700fb2bb713a95338dc8b98967915be23433c37"}, ] [package.dependencies] -dagster = "1.5.8" +dagster = "1.5.9" gql = {version = ">=3.0.0", extras = ["requests"]} graphene = ">=3" requests = "*" @@ -806,30 +806,30 @@ starlette = "*" [[package]] name = "dagster-pipes" -version = "1.5.8" +version = "1.5.9" description = "Toolkit for Dagster integrations with transform logic outside of Dagster" optional = false python-versions = "*" files = [ - {file = "dagster-pipes-1.5.8.tar.gz", hash = "sha256:b632a5aad45f6fc788731c6ef3b0afb167299dd5910c1212fbedcf5595ca11ff"}, - {file = "dagster_pipes-1.5.8-py3-none-any.whl", hash = "sha256:719201a63193a67294bde2936852ef848b97dbaadcd56e1316fa3eb08afef7e2"}, + {file = "dagster-pipes-1.5.9.tar.gz", hash = "sha256:66019cb851b8a7866f6240b9d9e7ed57749bc7e76e8e780d3eb2dbce8b65a864"}, + {file = "dagster_pipes-1.5.9-py3-none-any.whl", hash = "sha256:e59ba9aefa96bf1400660c0df8b9dab7401121fc3c54805601f2c8afedf42d0e"}, ] [[package]] name = "dagster-webserver" -version = "1.5.8" +version = "1.5.9" description = "Web UI for dagster." 
optional = false python-versions = "*" files = [ - {file = "dagster_webserver-1.5.8-py3-none-any.whl", hash = "sha256:5a9df81ea4417b0a22a7a2b7926c9ed4ce4513fe20ce43c1cd791ddaa2290329"}, - {file = "dagster_webserver-1.5.8.tar.gz", hash = "sha256:e80576b9e18f0a8781c2cfee17ec4027c5a54f4f9edc87aa15f9e1a28e4d9cd0"}, + {file = "dagster_webserver-1.5.9-py3-none-any.whl", hash = "sha256:d23f6fb7f2ab5605a5ef08c60d6f637d844766f4924d612848186a8f71779c1f"}, + {file = "dagster_webserver-1.5.9.tar.gz", hash = "sha256:914f8ba897b0fb7b0862dad1b86610277ba257f15bc083484058e1c49f0ed268"}, ] [package.dependencies] click = ">=7.0,<9.0" -dagster = "1.5.8" -dagster-graphql = "1.5.8" +dagster = "1.5.9" +dagster-graphql = "1.5.9" starlette = "*" uvicorn = {version = "*", extras = ["standard"]} @@ -885,13 +885,13 @@ dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "sphinx (<2)", "tox"] [[package]] name = "diffsync" -version = "1.9.0" +version = "1.10.0" description = "Library to easily sync/diff/update 2 different data sources" optional = false -python-versions = ">=3.7,<4.0" +python-versions = ">=3.8,<4.0" files = [ - {file = "diffsync-1.9.0-py3-none-any.whl", hash = "sha256:9abb7fbff9dfe2c7581d4e9700d16b158b454b3e4b1ef74a46c6ba27fe84958b"}, - {file = "diffsync-1.9.0.tar.gz", hash = "sha256:fe0332436b395d69262ad0711c0c050d9400504bdb8628c12248a4819a6364d7"}, + {file = "diffsync-1.10.0-py3-none-any.whl", hash = "sha256:f4368c97162d51eecc7a8e87026c731197a694026cabcf2ab4f16d18d7bdadbd"}, + {file = "diffsync-1.10.0.tar.gz", hash = "sha256:a9d7cb8e8ce983b446bf858c1c5c82cf473fcf231db73c0855e8c59ee7cd8370"}, ] [package.dependencies] @@ -899,6 +899,7 @@ colorama = ">=0.4.3,<0.5.0" packaging = ">=21.3,<24.0" pydantic = ">=1.7.4,<1.8 || >1.8,<1.8.1 || >1.8.1,<2.0.0" structlog = ">=20.1.0,<23.0.0" +typing-extensions = {version = ">=4.0.1", markers = "python_version < \"3.11\""} [package.extras] redis = ["redis (>=4.3,<5.0)"] @@ -3167,6 +3168,24 @@ pytest = ">=4.6" [package.extras] testing = 
["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtualenv"] +[[package]] +name = "pytest-httpx" +version = "0.21.3" +description = "Send responses to httpx." +optional = false +python-versions = ">=3.7" +files = [ + {file = "pytest_httpx-0.21.3-py3-none-any.whl", hash = "sha256:50b52b910f6f6cfb0aa65039d6f5bedb6ae3a0c02a98c4a7187543fe437c428a"}, + {file = "pytest_httpx-0.21.3.tar.gz", hash = "sha256:edcb62baceffbd57753c1a7afc4656b0e71e91c7a512e143c0adbac762d979c1"}, +] + +[package.dependencies] +httpx = "==0.23.*" +pytest = ">=6.0,<8.0" + +[package.extras] +testing = ["pytest-asyncio (==0.20.*)", "pytest-cov (==4.*)"] + [[package]] name = "pytest-xdist" version = "3.4.0" @@ -4666,4 +4685,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"] [metadata] lock-version = "2.0" python-versions = "^3.8, < 3.12" -content-hash = "af3ae745d04c78b74b6f6f2b83f5bf9f52fae58deb21988a8ecfc46429940607" +content-hash = "f1fa63228241738173abac806d7f5223fea4687a0e6facdc1201fd0f32dc0172" diff --git a/pyproject.toml b/pyproject.toml index f6fa6e8b7b..ad20e8d867 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -55,6 +55,7 @@ types-ujson = "*" types-pyyaml = "*" typer-cli = "*" pytest-cov = "^4.0.0" +pytest-httpx = "*" ruff = "^0.1.5" pytest-xdist = "^3.3.1" buildkite-test-collector = "^0.1.7" From c3f8e8e9d186d0c777685185c8bbd398d520a03c Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Fri, 17 Nov 2023 09:36:05 +0100 Subject: [PATCH 075/446] Remove pylint from sdk integration test and change base image to ubuntu-latest --- .github/workflows/ci.yml | 10 ++++------ 1 file changed, 4 insertions(+), 6 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index b353c51b83..649fa81e53 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -31,7 +31,7 @@ jobs: # ------------------------------------------ Check Files Changes ------------------------------------------ files-changed: name: Detect which file has changed - 
runs-on: ubuntu-20.04 + runs-on: ubuntu-latest timeout-minutes: 5 outputs: backend: ${{ steps.changes.outputs.backend_all }} @@ -122,7 +122,7 @@ jobs: !contains(needs.*.result, 'cancelled') && needs.files-changed.outputs.sdk == 'true' needs: ["files-changed", "yaml-lint", "python-lint"] - runs-on: "ubuntu-20.04" + runs-on: ubuntu-latest timeout-minutes: 30 env: INFRAHUB_DB_TYPE: memgraph @@ -169,7 +169,7 @@ jobs: !contains(needs.*.result, 'cancelled') && needs.files-changed.outputs.sdk == 'true' needs: ["python-sdk-unit-tests"] - runs-on: "ubuntu-20.04" + runs-on: ubuntu-latest timeout-minutes: 30 env: INFRAHUB_DB_TYPE: memgraph @@ -182,8 +182,6 @@ jobs: run: "invoke test.build" - name: "Pull External Docker Images" run: "invoke test.pull" - - name: "Pylint Tests" - run: "invoke sdk.pylint --docker" - name: "Integration Tests" run: "invoke sdk.test-integration" env: @@ -203,7 +201,7 @@ jobs: !contains(needs.*.result, 'cancelled') && needs.files-changed.outputs.sync == 'true' needs: ["files-changed", "yaml-lint", "python-lint"] - runs-on: "ubuntu-20.04" + runs-on: ubuntu-latest timeout-minutes: 30 steps: - name: "Check out repository code" From 7fb901c96757d52f876ccb19a6cf3d17a2775568 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Fri, 17 Nov 2023 09:39:59 +0100 Subject: [PATCH 076/446] add data-cy --- frontend/src/components/modal-delete.tsx | 3 ++- frontend/src/components/tabs.tsx | 4 +++- .../object-item-details/relationship-details-paginated.tsx | 6 ++++-- 3 files changed, 9 insertions(+), 4 deletions(-) diff --git a/frontend/src/components/modal-delete.tsx b/frontend/src/components/modal-delete.tsx index d076fe06ed..1263e7e243 100644 --- a/frontend/src/components/modal-delete.tsx +++ b/frontend/src/components/modal-delete.tsx @@ -69,7 +69,8 @@ export default function ModalDelete(props: iProps) { onClick={onDelete} buttonType={BUTTON_TYPES.CANCEL} className="ml-2" - isLoading={isLoading}> + isLoading={isLoading} + data-cy="modal-delete-confirm"> Delete @@ -536,7 
+537,8 @@ export default function RelationshipDetails(props: iRelationDetailsProps) { setShowAddDrawer(true)} - className="p-3 ml-2 bg-custom-blue-500 text-sm hover:bg-custom-blue-500 focus:ring-custom-blue-500 focus:ring-offset-gray-50 focus:ring-offset-2"> + className="p-3 ml-2 bg-custom-blue-500 text-sm hover:bg-custom-blue-500 focus:ring-custom-blue-500 focus:ring-offset-gray-50 focus:ring-offset-2" + data-cy="add-new-relationship"> From 6b672f52521ed2c5c888ad6c090a6083ed649208 Mon Sep 17 00:00:00 2001 From: Bilal Date: Fri, 17 Nov 2023 12:18:05 +0100 Subject: [PATCH 077/446] attempt to fix login flakiness --- frontend/cypress/support/e2e.ts | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/frontend/cypress/support/e2e.ts b/frontend/cypress/support/e2e.ts index a84b46f7bc..9ab77436d3 100644 --- a/frontend/cypress/support/e2e.ts +++ b/frontend/cypress/support/e2e.ts @@ -26,10 +26,11 @@ Cypress.Commands.add("login", (username: string, password: string) => { [username, password], () => { cy.visit("/signin"); + cy.contains("Sign in to your account").should("be.visible"); - cy.get(":nth-child(1) > .relative > .block").type(username, { delay: 0, force: true }); + cy.get(":nth-child(1) > .relative > .block").type(username); - cy.get(":nth-child(2) > .relative > .block").type(password, { delay: 0, force: true }); + cy.get(":nth-child(2) > .relative > .block").type(password); cy.get(".justify-end > .rounded-md").click(); }, From 4921791935c8d45a6564f6580524ae241ad328eb Mon Sep 17 00:00:00 2001 From: Bilal Date: Fri, 17 Nov 2023 12:47:42 +0100 Subject: [PATCH 078/446] boy scout: renamed modal delete's test tag + fixing typo --- frontend/src/components/modal-delete.tsx | 4 ++-- .../src/screens/branches/branch-details.tsx | 3 +-- frontend/tests/e2e/branches.cy.ts | 22 ++++++++++--------- .../object-details-relationships.cy.tsx | 2 +- 4 files changed, 16 insertions(+), 15 deletions(-) diff --git a/frontend/src/components/modal-delete.tsx 
b/frontend/src/components/modal-delete.tsx index 1263e7e243..20e83e100b 100644 --- a/frontend/src/components/modal-delete.tsx +++ b/frontend/src/components/modal-delete.tsx @@ -14,7 +14,7 @@ interface iProps { } export default function ModalDelete(props: iProps) { - const { title, description, onCancel, onDelete, open, setOpen, isLoading, ...otherProps } = props; + const { title, description, onCancel, onDelete, open, setOpen, isLoading } = props; const cancelButtonRef = useRef(null); return ( @@ -43,7 +43,7 @@ export default function ModalDelete(props: iProps) { leaveTo="opacity-0 translate-y-4 sm:translate-y-0 sm:scale-95"> + data-cy="modal-delete">

diff --git a/frontend/src/screens/branches/branch-details.tsx b/frontend/src/screens/branches/branch-details.tsx index fdebdaa8be..b185be0c2d 100644 --- a/frontend/src/screens/branches/branch-details.tsx +++ b/frontend/src/screens/branches/branch-details.tsx @@ -129,7 +129,7 @@ export const BranchDetails = () => { title="Delete" description={ <> - Are you sure you want to remove the the branch + Are you sure you want to remove the branch
`{branch?.name}`? } @@ -157,7 +157,6 @@ export const BranchDetails = () => { }} open={displayModal} setOpen={() => setDisplayModal(false)} - data-cy="modal-branch-delete" /> )} diff --git a/frontend/tests/e2e/branches.cy.ts b/frontend/tests/e2e/branches.cy.ts index 296174cf0b..d7dae2fe8c 100644 --- a/frontend/tests/e2e/branches.cy.ts +++ b/frontend/tests/e2e/branches.cy.ts @@ -45,11 +45,12 @@ describe("Branches creation and deletion", () => { cy.visit("/branches/test456?branch=test123"); cy.contains("button", "Delete").click(); - cy.get("[data-cy='modal-branch-delete']").contains("h3", "Delete"); - cy.get("[data-cy='modal-branch-delete']").contains( - "Are you sure you want to remove the the branch `test456`?" - ); - cy.get("[data-cy='modal-branch-delete']").contains("button", "Delete").click(); + + cy.get("[data-cy='modal-delete']").within(() => { + cy.contains("h3", "Delete").should("be.visible"); + cy.contains("Are you sure you want to remove the branch `test456`?").should("be.visible"); + cy.contains("button", "Delete").click(); + }); cy.get("[data-cy='branch-select-menu']").contains("test123"); cy.url().should("include", "/branches").and("include", "branch=test123"); @@ -63,11 +64,12 @@ describe("Branches creation and deletion", () => { cy.visit("/branches/test123?branch=test123"); cy.contains("button", "Delete").click(); - cy.get("[data-cy='modal-branch-delete']").contains("h3", "Delete"); - cy.get("[data-cy='modal-branch-delete']").contains( - "Are you sure you want to remove the the branch `test123`?" 
- ); - cy.get("[data-cy='modal-branch-delete']").contains("button", "Delete").click(); + + cy.get("[data-cy='modal-delete']").within(() => { + cy.contains("h3", "Delete").should("be.visible"); + cy.contains("Are you sure you want to remove the branch `test123`?").should("be.visible"); + cy.contains("button", "Delete").click(); + }); cy.get("[data-cy='branch-select-menu']").contains("main"); cy.url().should("not.include", "branch=test123"); diff --git a/frontend/tests/integrations/screens/object-details-relationships.cy.tsx b/frontend/tests/integrations/screens/object-details-relationships.cy.tsx index 5afa6a6a8f..54dad600ad 100644 --- a/frontend/tests/integrations/screens/object-details-relationships.cy.tsx +++ b/frontend/tests/integrations/screens/object-details-relationships.cy.tsx @@ -110,7 +110,7 @@ describe("List screen", () => { , { - // Add iniital route for the app router, to display the current items view + // Add initial route for the app router, to display the current items view routerProps: { initialEntries: [graphqlQueryItemsUrl], }, From b2593b683889691cf9310857aaaee6e0d29c807b Mon Sep 17 00:00:00 2001 From: Bilal Date: Fri, 17 Nov 2023 13:58:20 +0100 Subject: [PATCH 079/446] added test for Relationship creation and deletion --- .../src/components-form/select-2-step.tsx | 2 + frontend/src/components/select.tsx | 5 +- frontend/src/screens/edit-form-hook/form.tsx | 2 +- .../relationship-details-paginated.tsx | 7 +- frontend/tests/e2e/relationships.cy.ts | 81 +++++++++++++++++++ 5 files changed, 91 insertions(+), 6 deletions(-) create mode 100644 frontend/tests/e2e/relationships.cy.ts diff --git a/frontend/src/components-form/select-2-step.tsx b/frontend/src/components-form/select-2-step.tsx index 57a9efac11..743ca3337d 100644 --- a/frontend/src/components-form/select-2-step.tsx +++ b/frontend/src/components-form/select-2-step.tsx @@ -118,6 +118,7 @@ export const OpsSelect2Step = (props: Props) => { setSelectedLeft(options.filter((option) => option.id 
=== value.id)[0]); }} isProtected={isProtected} + data-cy="select2step-1" />
@@ -137,6 +138,7 @@ export const OpsSelect2Step = (props: Props) => { }); }} isProtected={isProtected} + data-cy="select2step-2" /> )}
diff --git a/frontend/src/components/select.tsx b/frontend/src/components/select.tsx index 09efed5246..68ca085b3e 100644 --- a/frontend/src/components/select.tsx +++ b/frontend/src/components/select.tsx @@ -25,7 +25,7 @@ type SelectProps = { }; export const Select = (props: SelectProps) => { - const { options, value, onChange, disabled, error, direction } = props; + const { options, value, onChange, disabled, error, direction, ...otherProps } = props; const [query, setQuery] = useState(""); @@ -55,7 +55,8 @@ export const Select = (props: SelectProps) => { setSelectedOption(item); onChange(item); }} - disabled={disabled}> + disabled={disabled} + {...otherProps}>
+
diff --git a/frontend/src/screens/object-item-details/relationship-details-paginated.tsx b/frontend/src/screens/object-item-details/relationship-details-paginated.tsx index 0b646e5177..c53ab7cc85 100644 --- a/frontend/src/screens/object-item-details/relationship-details-paginated.tsx +++ b/frontend/src/screens/object-item-details/relationship-details-paginated.tsx @@ -362,7 +362,8 @@ export default function RelationshipDetails(props: iRelationDetailsProps) { ) } key={index} - className="hover:bg-gray-50 cursor-pointer"> + className="hover:bg-gray-50 cursor-pointer" + data-cy="relationship-row"> {newColumns?.map((column) => ( { setRelatedRowToDelete(node); }} - data-cy="remove-relationship"> + data-cy="relationship-delete-button"> @@ -538,7 +539,7 @@ export default function RelationshipDetails(props: iRelationDetailsProps) { disabled={!auth?.permissions?.write} onClick={() => setShowAddDrawer(true)} className="p-3 ml-2 bg-custom-blue-500 text-sm hover:bg-custom-blue-500 focus:ring-custom-blue-500 focus:ring-offset-gray-50 focus:ring-offset-2" - data-cy="add-new-relationship"> + data-cy="open-relationship-form-button">
diff --git a/frontend/tests/e2e/relationships.cy.ts b/frontend/tests/e2e/relationships.cy.ts new file mode 100644 index 0000000000..9e085e2bdd --- /dev/null +++ b/frontend/tests/e2e/relationships.cy.ts @@ -0,0 +1,81 @@ +/// + +import { ADMIN_CREDENTIALS } from "../utils"; + +describe("Relationship Page", () => { + it("should display object relationships without login", () => { + cy.visit("/objects/InfraDevice"); + cy.contains("atl1-edge1").click(); + + cy.contains("button", "Edit").should("be.disabled"); + cy.contains("button", "Manage groups").should("be.disabled"); + + cy.contains("Artifacts").click(); + cy.url().should("include", "tab=artifacts"); + cy.get("[data-cy='relationship-row']").should("have.length", 2); + + cy.contains("Interfaces").click(); + cy.url().should("include", "tab=interfaces"); + cy.get("[data-cy='relationship-row']").should("have.length", 10); + cy.contains("Showing 1 to 10 of 14 results").should("exist"); + cy.get("[data-cy='metadata-edit-button']").should("be.disabled"); + cy.get("[data-cy='relationship-delete-button']").should("be.disabled"); + cy.get("[data-cy='open-relationship-form-button']").should("be.disabled"); + }); + + it("should create a new relationship", () => { + cy.login(ADMIN_CREDENTIALS.username, ADMIN_CREDENTIALS.password); + cy.visit("/objects/InfraDevice"); + cy.contains("atl1-edge1").click(); + cy.contains("Interfaces").click(); + + cy.get("[data-cy='open-relationship-form-button']").click(); + cy.contains("Add associated Interfaces").should("be.visible"); + + // Form; + cy.get("[data-cy='form']").within(() => { + // fill 1st select by typing + cy.get("[data-cy='select2step-2']").should("not.exist"); + cy.get("[data-cy='select2step-1']").type("Int"); + cy.contains("InterfaceL2").click(); + + // fill 2md select with click only + cy.get("[data-cy='select2step-2']").should("be.visible"); + cy.get("[data-cy='select2step-2'] button").click(); + cy.contains("Ethernet11").click(); + + 
cy.get("[data-cy='submit-form']").click(); + }); + + cy.contains("Association with InfraInterface added").should("be.visible"); + cy.get("[data-cy='relationship-row']").contains("Ethernet11").should("be.visible"); + }); + + it("should delete the newly created relationship", () => { + cy.login(ADMIN_CREDENTIALS.username, ADMIN_CREDENTIALS.password); + cy.visit("/objects/InfraDevice"); + cy.contains("atl1-edge1").click(); + cy.contains("Interfaces").click(); + + // get delete button from row containing Ethernet11 + cy.get("[data-cy='relationship-row']") + .contains(/^Ethernet11$/) + .parent() + .within(() => { + cy.get("[data-cy='relationship-delete-button']").click(); + }); + + // Modal delete + cy.get("[data-cy='modal-delete']").within(() => { + cy.contains( + "Are you sure you want to remove the association between `atl1-edge1` and `Ethernet11`? The `InfraInterfaceL2` `Ethernet11` won't be deleted in the process." + ).should("be.visible"); + cy.contains("button", "Delete").click(); + }); + + // after delete + cy.contains("Item removed from the group").should("be.visible"); + cy.get("[data-cy='modal-delete']").should("not.exist"); + cy.get("[data-cy='relationship-row']").should("not.contain", /^Ethernet11$/); + }); +}); From 9fe573a5ef0596dd85b8e2bcfbdee799afed7d40 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Fri, 17 Nov 2023 17:04:24 +0100 Subject: [PATCH 080/446] add debounce to time selector and fix some logic --- frontend/src/screens/layout/header.tsx | 37 +++++++++++++++----------- frontend/src/utils/common.ts | 19 +++++++++++++ 2 files changed, 40 insertions(+), 16 deletions(-) diff --git a/frontend/src/screens/layout/header.tsx b/frontend/src/screens/layout/header.tsx index d1f82c3cd0..d857cde971 100644 --- a/frontend/src/screens/layout/header.tsx +++ b/frontend/src/screens/layout/header.tsx @@ -2,7 +2,7 @@ import { gql, useReactiveVar } from "@apollo/client"; import { Menu, Transition } from "@headlessui/react"; import { MagnifyingGlassIcon } from 
"@heroicons/react/20/solid"; import { Bars3BottomLeftIcon } from "@heroicons/react/24/outline"; -import { formatISO, isEqual } from "date-fns"; +import { formatISO, isEqual, isValid } from "date-fns"; import { useAtom } from "jotai"; import React, { Fragment, useContext, useEffect } from "react"; import { Link, useNavigate } from "react-router-dom"; @@ -19,8 +19,7 @@ import { getProfileDetails } from "../../graphql/queries/profile/getProfileDetai import { dateVar } from "../../graphql/variables/dateVar"; import useQuery from "../../hooks/useQuery"; import { schemaState } from "../../state/atoms/schema.atom"; -import { classNames, parseJwt } from "../../utils/common"; -import LoadingScreen from "../loading-screen/loading-screen"; +import { classNames, debounce, parseJwt } from "../../utils/common"; import { userNavigation } from "./navigation-list"; interface Props { @@ -60,6 +59,11 @@ export default function Header(props: Props) { const { error, loading, data } = useQuery(query, { skip: !schema || !accountId }); useEffect(() => { + // Remove the date from the state + if (!qspDate || (qspDate && !isValid(new Date(qspDate)))) { + dateVar(null); + } + if (qspDate) { const newQspDate = new Date(qspDate); @@ -68,11 +72,6 @@ export default function Header(props: Props) { dateVar(newQspDate); } } - - // Remove the date from the state - if (!qspDate) { - dateVar(null); - } }, [date, qspDate]); const handleDateChange = (newDate: any) => { @@ -84,18 +83,20 @@ export default function Header(props: Props) { } }; + const debouncedHandleDateChange = debounce(handleDateChange); + const handleClickNow = () => { // Undefined is needed to remove a parameter from the QSP setQspDate(undefined); }; - if (loading || !schema) { - return ( -
-        <LoadingScreen />
-      </div>
-    );
-  }
+  // if (loading || !schema) {
+  //   return (
+  //     <div>
+  //       <LoadingScreen />
+  //     </div>
+ // ); + // } const profile = data?.AccountProfile; @@ -147,7 +148,11 @@ export default function Header(props: Props) {
- +
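The commit above swaps the date picker's `onChange` over to `debouncedHandleDateChange`, built from the `debounce` helper added to `frontend/src/utils/common.ts` in the next hunk. As a rough, language-agnostic sketch of those semantics — trailing-edge by default, leading-edge when `immediate` is set — here is a hypothetical Python analogue (not code from this patch; names and timings are illustrative):

```python
import threading
import time

def debounce(wait=1.0, immediate=False):
    """Delay calls until `wait` seconds pass without a new call (trailing
    edge); with immediate=True, fire once on the leading edge instead."""
    def decorator(func):
        timer = None
        lock = threading.Lock()

        def wrapper(*args, **kwargs):
            nonlocal timer

            def later():
                nonlocal timer
                with lock:
                    timer = None
                if not immediate:
                    func(*args, **kwargs)

            with lock:
                call_now = immediate and timer is None
                if timer is not None:
                    timer.cancel()  # each new call resets the countdown
                timer = threading.Timer(wait, later)
                timer.start()
            if call_now:
                func(*args, **kwargs)

        return wrapper
    return decorator

calls = []

@debounce(wait=0.05)
def record(value):
    calls.append(value)

for i in range(5):
    record(i)  # five rapid calls collapse into one trailing call

time.sleep(0.2)  # let the trailing timer fire
print(calls)     # expected: [4]
```

In the header this means a user scrubbing through the time selector only triggers one QSP update once they pause, instead of one per keystroke.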
diff --git a/frontend/src/utils/common.ts b/frontend/src/utils/common.ts index 1d6c57f267..bf02d9ad06 100644 --- a/frontend/src/utils/common.ts +++ b/frontend/src/utils/common.ts @@ -42,3 +42,22 @@ export const encodeJwt = (data: any): string => { // Add "." to be decoded by parseJwt return `.${btoa(JSON.stringify(data))}`; }; + +const DEFAULT_DEBOUNCE = 1000; + +export const debounce = (func: any, wait = DEFAULT_DEBOUNCE, immediate?: boolean) => { + let timeout: any; + return function executedFunction(this: any) { + const context = this; + // eslint-disable-next-line prefer-rest-params + const args = arguments; + const later = () => { + timeout = null; + if (!immediate) func.apply(context, args); + }; + const callNow = immediate && !timeout; + clearTimeout(timeout); + timeout = setTimeout(later, wait); + if (callNow) func.apply(context, args); + }; +}; From f391d7a6bc1301ffe403bb326c2da1e69e06009e Mon Sep 17 00:00:00 2001 From: pa-lem Date: Fri, 17 Nov 2023 17:21:44 +0100 Subject: [PATCH 081/446] fetch menu on branch change --- frontend/src/config/config.ts | 3 ++- frontend/src/screens/layout/desktop-menu.tsx | 8 ++++++-- 2 files changed, 8 insertions(+), 3 deletions(-) diff --git a/frontend/src/config/config.ts b/frontend/src/config/config.ts index b21f20adb0..b152d5ee9b 100644 --- a/frontend/src/config/config.ts +++ b/frontend/src/config/config.ts @@ -35,5 +35,6 @@ export const CONFIG = { FILES_CONTENT_URL: (repositoryId: string, location: string) => `${INFRAHUB_API_SERVER_URL}/api/file/${repositoryId}/${encodeURIComponent(location)}`, STORAGE_DETAILS_URL: (id: string) => `${INFRAHUB_API_SERVER_URL}/api/storage/object/${id}`, - MENU_URL: `${INFRAHUB_API_SERVER_URL}/api/menu`, + MENU_URL: (branch?: string) => + `${INFRAHUB_API_SERVER_URL}/api/menu${branch ? 
`?branch=${branch}` : ""}`, }; diff --git a/frontend/src/screens/layout/desktop-menu.tsx b/frontend/src/screens/layout/desktop-menu.tsx index 7971a33aca..467143d318 100644 --- a/frontend/src/screens/layout/desktop-menu.tsx +++ b/frontend/src/screens/layout/desktop-menu.tsx @@ -1,8 +1,10 @@ +import { useReactiveVar } from "@apollo/client"; import { useEffect, useState } from "react"; import { useNavigate } from "react-router-dom"; import { toast } from "react-toastify"; import { ALERT_TYPES, Alert } from "../../components/alert"; import { CONFIG } from "../../config/config"; +import { branchVar } from "../../graphql/variables/branchVar"; import logo from "../../images/Infrahub-SVG-hori.svg"; import { fetchUrl } from "../../utils/fetch"; import LoadingScreen from "../loading-screen/loading-screen"; @@ -12,6 +14,8 @@ import { Footer } from "./footer"; export default function DesktopMenu() { const navigate = useNavigate(); + const branch = useReactiveVar(branchVar); + const [isLoading, setIsLoading] = useState(false); const [menu, setMenu] = useState([]); @@ -19,7 +23,7 @@ export default function DesktopMenu() { try { setIsLoading(true); - const result = await fetchUrl(CONFIG.MENU_URL); + const result = await fetchUrl(CONFIG.MENU_URL(branch?.name)); setMenu(result); @@ -33,7 +37,7 @@ export default function DesktopMenu() { useEffect(() => { fecthMenu(); - }, []); + }, [branch?.name]); return (
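The branch-aware `CONFIG.MENU_URL` above appends the branch name to the query string verbatim. A sketch of the same logic as a hypothetical Python port (the base URL is a placeholder; `urlencode` is added here so that branch names containing `/` or spaces survive, which the TypeScript template literal does not guarantee):

```python
from urllib.parse import urlencode

# Hypothetical base URL; the real value comes from INFRAHUB_API_SERVER_URL.
API_SERVER_URL = "http://localhost:8000"

def menu_url(branch=None):
    """Mirror of CONFIG.MENU_URL: add ?branch=<name> only when a branch
    is selected, so the default branch keeps the bare endpoint."""
    base = f"{API_SERVER_URL}/api/menu"
    if not branch:
        return base
    return f"{base}?{urlencode({'branch': branch})}"

print(menu_url())          # http://localhost:8000/api/menu
print(menu_url("main"))    # http://localhost:8000/api/menu?branch=main
print(menu_url("feat/x"))  # 'feat/x' arrives URL-encoded as branch=feat%2Fx
```

Combined with `[branch?.name]` in the `useEffect` dependency array, switching branches re-fetches the menu for the newly selected branch.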
From abeafc590dc832e5e89adb7d843adc96e02e288e Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 14:48:32 +0100 Subject: [PATCH 082/446] Initialize git-agent with services object Make the services object available throughout the git-agent. The main reason for this is to be able to initialize repositories and supply the service object. This PR is related to the changes in #1429. In this iteration the code to initialize RabbitMQ has just been copied as they were before. A future step will be to have the RabbitMQ adapter take care of this for both the git-agent and api-server and remove this code from the git-agent all together. --- backend/infrahub/cli/git_agent.py | 50 +++++++++++++++++------------- backend/infrahub/git/actions.py | 21 ++++++++----- backend/infrahub/git/repository.py | 12 ++++--- 3 files changed, 50 insertions(+), 33 deletions(-) diff --git a/backend/infrahub/cli/git_agent.py b/backend/infrahub/cli/git_agent.py index c114135a4b..2717199841 100644 --- a/backend/infrahub/cli/git_agent.py +++ b/backend/infrahub/cli/git_agent.py @@ -45,7 +45,7 @@ def callback() -> None: """ -async def subscribe_rpcs_queue(client: InfrahubClient) -> None: +async def subscribe_rpcs_queue(service: InfrahubServices) -> None: """Subscribe to the RPCs queue and execute the corresponding action when a valid RPC is received.""" # TODO generate an exception if the broker is not properly configured # and return a proper message to the user @@ -58,20 +58,6 @@ async def subscribe_rpcs_queue(client: InfrahubClient) -> None: ) events_queue = await channel.declare_queue(name=f"worker-events-{WORKER_IDENTITY}", exclusive=True) - exchange = await channel.declare_exchange(f"{config.SETTINGS.broker.namespace}.events", type="topic", durable=True) - await events_queue.bind(exchange, routing_key="refresh.registry.*") - delayed_exchange = await channel.get_exchange(name=f"{config.SETTINGS.broker.namespace}.delayed") - driver = await get_db() - database = 
InfrahubDatabase(driver=driver) - service = InfrahubServices( - cache=RedisCache(), - client=client, - database=database, - message_bus=RabbitMQMessageBus(channel=channel, exchange=exchange, delayed_exchange=delayed_exchange), - ) - async with service.database.start_session() as db: - await initialization(db=db) - worker_callback = WorkerCallback(service=service) await events_queue.consume(worker_callback.run_command, no_ack=True) log.info("Waiting for RPC instructions to execute .. ") @@ -93,19 +79,19 @@ async def subscribe_rpcs_queue(client: InfrahubClient) -> None: log.exception("Processing error for message %r" % message) -async def initialize_git_agent(client: InfrahubClient) -> None: +async def initialize_git_agent(service: InfrahubServices) -> None: log.info("Initializing Git Agent ...") initialize_repositories_directory() # TODO Validate access to the GraphQL API with the proper credentials - await sync_remote_repositories(client=client) + await sync_remote_repositories(service=service) -async def monitor_remote_activity(client: InfrahubClient, interval: int) -> None: +async def monitor_remote_activity(service: InfrahubServices, interval: int) -> None: log.info("Monitoring remote repository for updates .. 
") while True: - await sync_remote_repositories(client=client) + await sync_remote_repositories(service=service) await asyncio.sleep(interval) @@ -128,11 +114,31 @@ async def _start(debug: bool, interval: int, port: int) -> None: # Initialize the lock initialize_lock() - await initialize_git_agent(client=client) + connection = await get_broker() + + # Create a channel and subscribe to the incoming RPC queue + channel = await connection.channel() + events_queue = await channel.declare_queue(name=f"worker-events-{WORKER_IDENTITY}", exclusive=True) + + exchange = await channel.declare_exchange(f"{config.SETTINGS.broker.namespace}.events", type="topic", durable=True) + await events_queue.bind(exchange, routing_key="refresh.registry.*") + delayed_exchange = await channel.get_exchange(name=f"{config.SETTINGS.broker.namespace}.delayed") + driver = await get_db() + database = InfrahubDatabase(driver=driver) + service = InfrahubServices( + cache=RedisCache(), + client=client, + database=database, + message_bus=RabbitMQMessageBus(channel=channel, exchange=exchange, delayed_exchange=delayed_exchange), + ) + await initialize_git_agent(service=service) + + async with service.database.start_session() as db: + await initialization(db=db) tasks = [ - asyncio.create_task(subscribe_rpcs_queue(client=client)), - asyncio.create_task(monitor_remote_activity(client=client, interval=interval)), + asyncio.create_task(subscribe_rpcs_queue(service=service)), + asyncio.create_task(monitor_remote_activity(service=service, interval=interval)), ] await asyncio.gather(*tasks) diff --git a/backend/infrahub/git/actions.py b/backend/infrahub/git/actions.py index 8751d28859..35aa3fcc00 100644 --- a/backend/infrahub/git/actions.py +++ b/backend/infrahub/git/actions.py @@ -1,25 +1,28 @@ import logging -from infrahub_sdk import InfrahubClient - from infrahub import lock from infrahub.exceptions import RepositoryError +from infrahub.services import InfrahubServices from .repository import 
InfrahubRepository LOGGER = logging.getLogger("infrahub.git") -async def sync_remote_repositories(client: InfrahubClient) -> None: - branches = await client.branch.all() - repositories = await client.get_list_repositories(branches=branches) +async def sync_remote_repositories(service: InfrahubServices) -> None: + branches = await service.client.branch.all() + repositories = await service.client.get_list_repositories(branches=branches) for repo_name, repository in repositories.items(): async with lock.registry.get(name=repo_name, namespace="repository"): init_failed = False try: repo = await InfrahubRepository.init( - id=repository.id, name=repository.name, location=repository.location, client=client + service=service, + id=repository.id, + name=repository.name, + location=repository.location, + client=service.client, ) except RepositoryError as exc: LOGGER.error(exc) @@ -28,7 +31,11 @@ async def sync_remote_repositories(client: InfrahubClient) -> None: if init_failed: try: repo = await InfrahubRepository.new( - id=repository.id, name=repository.name, location=repository.location, client=client + service=service, + id=repository.id, + name=repository.name, + location=repository.location, + client=service.client, ) await repo.import_objects_from_files(branch_name=repo.default_branch_name) except RepositoryError as exc: diff --git a/backend/infrahub/git/repository.py b/backend/infrahub/git/repository.py index 5c0b256edd..24abd21350 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -40,6 +40,7 @@ TransformError, ) from infrahub.log import get_logger +from infrahub.services import InfrahubServices if TYPE_CHECKING: from infrahub_sdk.branch import BranchData @@ -340,6 +341,7 @@ class InfrahubRepository(BaseModel): # pylint: disable=too-many-public-methods client: Optional[InfrahubClient] cache_repo: Optional[Repo] + service: InfrahubServices class Config: arbitrary_types_allowed = True @@ -513,15 +515,17 @@ async def 
create_locally(self) -> bool: return True @classmethod - async def new(cls, **kwargs): - self = cls(**kwargs) + async def new(cls, service: Optional[InfrahubServices] = None, **kwargs): + service = service or InfrahubServices() + self = cls(service=service, **kwargs) await self.create_locally() LOGGER.info(f"{self.name} | Created the new project locally.") return self @classmethod - async def init(cls, **kwargs): - self = cls(**kwargs) + async def init(cls, service: Optional[InfrahubServices] = None, **kwargs): + service = service or InfrahubServices() + self = cls(service=service, **kwargs) self.validate_local_directories() LOGGER.debug(f"{self.name} | Initiated the object on an existing directory.") return self From 5034342b9baa0c9d534cb21153469c891caf156a Mon Sep 17 00:00:00 2001 From: Bilal Date: Sat, 18 Nov 2023 01:04:05 +0100 Subject: [PATCH 083/446] ignore jetBrains IDE files --- .gitignore | 1 + 1 file changed, 1 insertion(+) diff --git a/.gitignore b/.gitignore index 8c367a87dd..7d12c780d7 100644 --- a/.gitignore +++ b/.gitignore @@ -11,6 +11,7 @@ development/docker-compose.dev-override.yml .python-version .ruff_cache **/.ruff_cache +**/.idea/** # Direnv files (https://direnv.net/) .direnv/ From 8cb401d789b330cf42bcc587ca2a9380dd78a359 Mon Sep 17 00:00:00 2001 From: Bilal Date: Sat, 18 Nov 2023 01:04:16 +0100 Subject: [PATCH 084/446] added .editorconfig --- .editorconfig | 17 +++++++++++++++++ 1 file changed, 17 insertions(+) create mode 100644 .editorconfig diff --git a/.editorconfig b/.editorconfig new file mode 100644 index 0000000000..54f5bf8ab3 --- /dev/null +++ b/.editorconfig @@ -0,0 +1,17 @@ +# https://editorconfig.org +# Top-most EditorConfig file +root = true + +[*] +charset = utf-8 +end_of_line = lf +insert_final_newline = true +indent_size = 2 +indent_style = space +trim_trailing_whitespace = true + +[*.py] +indent_size = 4 + +[*.md] +trim_trailing_whitespace = false From 1d6563213b6aaa2516858495e859a353c6c4a70e Mon Sep 17 00:00:00 2001 From: 
Damien Garros
Date: Sun, 19 Nov 2023 11:46:34 +0100
Subject: [PATCH 085/446] Update memgraph version to 2.12.1

---
 tasks/shared.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tasks/shared.py b/tasks/shared.py
index e3f5446950..51adc39609 100644
--- a/tasks/shared.py
+++ b/tasks/shared.py
@@ -19,7 +19,7 @@ class DatabaseType(str, Enum):
 INFRAHUB_DATABASE = os.getenv("INFRAHUB_DB_TYPE", DatabaseType.NEO4J.value)
 DATABASE_DOCKER_IMAGE = os.getenv("DATABASE_DOCKER_IMAGE", None)
-MEMGRAPH_DOCKER_IMAGE = os.getenv("MEMGRAPH_DOCKER_IMAGE", "memgraph/memgraph:2.11.0")
+MEMGRAPH_DOCKER_IMAGE = os.getenv("MEMGRAPH_DOCKER_IMAGE", "memgraph/memgraph:2.12.1")
 NEO4J_DOCKER_IMAGE = os.getenv("NEO4J_DOCKER_IMAGE", "neo4j:5.13-community")
 MESSAGE_QUEUE_DOCKER_IMAGE = os.getenv("MESSAGE_QUEUE_DOCKER_IMAGE", "rabbitmq:3.12-management")
 CACHE_DOCKER_IMAGE = os.getenv("CACHE_DOCKER_IMAGE", "redis:7.2")

From 47a8e98a533428ac20a5484124141c26915e4424 Mon Sep 17 00:00:00 2001
From: Damien Garros
Date: Sun, 19 Nov 2023 11:47:32 +0100
Subject: [PATCH 086/446] Set Matrix for E2E tests to validate both memgraph and neo4j

---
 .github/workflows/ci.yml | 70 +++-------------------------------------
 1 file changed, 5 insertions(+), 65 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index a0282ba7fd..339ec36e72 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -360,70 +360,7 @@ jobs:
         run: "invoke docs.build"

 # ------------------------------------------ E2E Tests ------------------------------------------
-  # E2E-testing-memgraph:
-  #   needs: ["frontend-tests", "backend-tests-default", "python-sdk-tests"]
-  #   if: |
-  #     always() && !cancelled() &&
-  #     !contains(needs.*.result, 'failure') &&
-  #     !contains(needs.*.result, 'cancelled')
-  #   runs-on: "runner-ubuntu-8-32"
-  #   timeout-minutes: 30
-  #   steps:
-  #     - name: "Check out repository code"
-  #       uses: "actions/checkout@v3"
-  #     - name: Install NodeJS
-  #       uses: actions/setup-node@v3
- # with: - # node-version: 16 - # cache: 'npm' - # cache-dependency-path: frontend/package-lock.json - # - name: Install frontend dependencies - # working-directory: ./frontend - # run: npm install - # - name: "Install Invoke" - # run: "pip install toml invoke" - # - name: Build Demo - # run: "invoke demo.build" - # - name: "Pull External Docker Images" - # run: "invoke demo.pull" - # - name: Initialize Demo - # id: init-demo - # run: "invoke demo.start demo.load-infra-schema" - # - name: Check Demo Status - # run: "invoke demo.status" - # - name: Load Data - # run: "invoke demo.load-infra-data" - # - name: Git Repository - # run: "invoke demo.infra-git-import demo.infra-git-create" - # - name: Run End to End Tests - # working-directory: ./frontend - # run: npm run cypress:run:e2e - # - name: Containers after failure - # if: failure() - # run: docker ps -a - # - name: Upload cypress screenshots - # if: failure() - # uses: actions/upload-artifact@v3 - # with: - # name: screenshots - # path: docs/media/* - # - name: Display server logs - # if: failure() - # run: docker logs infrahub-infrahub-server-1 - # - name: Display git 1 logs - # if: failure() - # run: docker logs infrahub-infrahub-git-1 - # - name: Display git 2 logs - # if: failure() - # run: docker logs infrahub-infrahub-git-2 - # - name: Display database logs - # if: failure() - # run: docker logs infrahub-database-1 - # - name: Display server status - # if: failure() - # run: invoke demo.status - - E2E-testing-neo4j: + E2E-testing: needs: - javascript-lint - files-changed @@ -433,10 +370,13 @@ jobs: always() && !cancelled() && !contains(needs.*.result, 'failure') && !contains(needs.*.result, 'cancelled') + strategy: + matrix: + python-version: ["neo4j", "memgraph"] runs-on: "runner-ubuntu-8-32" timeout-minutes: 40 env: - INFRAHUB_DB_TYPE: neo4j + INFRAHUB_DB_TYPE: ${{ matrix.python-version }} steps: - name: "Check out repository code" uses: "actions/checkout@v3" From 
6d97da82ac43f325109f61fd3a7d97daf1fc2ab2 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Thu, 16 Nov 2023 09:21:36 +0100 Subject: [PATCH 087/446] Move repository check imports to .infrahub.yml --- backend/infrahub/git/repository.py | 115 +++-- .../repos/infrahub-demo-edge/.gitignore | 4 + .../repos/infrahub-demo-edge/.infrahub.yml | 35 ++ .../checks/check_backbone_link_redundancy.gql | 45 ++ .../checks/check_backbone_link_redundancy.py | 35 ++ .../schemas/demo_edge_fabric.yml | 26 + .../templates/device_startup_config.tpl.j2 | 111 ++++ .../templates/device_startup_info.gql | 77 +++ .../infrahub-demo-edge/tests/conftest.py | 56 ++ .../oc_bgp_neighbors/test01/data.json | 481 ++++++++++++++++++ .../oc_bgp_neighbors/test01/response.json | 105 ++++ .../fixtures/oc_interfaces/test01/data.json | 275 ++++++++++ .../oc_interfaces/test01/response.json | 226 ++++++++ .../integration/graphql/test_graphql_query.py | 20 + .../transforms/test_openconfig_integration.py | 24 + .../unit/transforms/test_openconfig_unit.py | 13 + .../topology/topology_info.gql | 36 ++ .../transforms/oc_bgp_neighbors.gql | 47 ++ .../transforms/oc_interfaces.gql | 35 ++ .../transforms/openconfig.py | 83 +++ backend/tests/integration/git/conftest.py | 14 +- .../integration/git/test_git_repository.py | 9 +- python_sdk/infrahub_sdk/schema.py | 7 +- 23 files changed, 1831 insertions(+), 48 deletions(-) create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/.gitignore create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.gql create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.py create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/schemas/demo_edge_fabric.yml create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_config.tpl.j2 create mode 100644 
backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_info.gql create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/conftest.py create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/data.json create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/response.json create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/data.json create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/response.json create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/graphql/test_graphql_query.py create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/transforms/test_openconfig_integration.py create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/tests/unit/transforms/test_openconfig_unit.py create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/topology/topology_info.gql create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_bgp_neighbors.gql create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_interfaces.gql create mode 100644 backend/tests/fixtures/repos/infrahub-demo-edge/transforms/openconfig.py diff --git a/backend/infrahub/git/repository.py b/backend/infrahub/git/repository.py index 24abd21350..3e878441e7 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -1024,8 +1024,10 @@ async def import_objects_from_files(self, branch_name: str, commit: Optional[str await self.import_schema_files(branch_name=branch_name, commit=commit) await self.import_all_graphql_query(branch_name=branch_name, commit=commit) - await self.import_all_python_files(branch_name=branch_name, commit=commit) await self.import_all_yaml_files(branch_name=branch_name, commit=commit) + config_file = await 
self.get_repository_config(branch_name=branch_name, commit=commit) + if config_file: + await self.import_all_python_files(branch_name=branch_name, commit=commit, config_file=config_file) async def import_objects_rfiles(self, branch_name: str, commit: str, data: List[dict]): LOGGER.debug(f"{self.name} | Importing all RFiles in branch {branch_name} ({commit}) ") @@ -1370,44 +1372,51 @@ async def create_graphql_query(self, branch_name: str, name: str, query_string: await obj.save() return obj - async def import_python_check_definitions_from_module( - self, branch_name: str, commit: str, module: types.ModuleType, file_path: str + async def import_python_check_definitions( + self, branch_name: str, commit: str, config_file: InfrahubRepositoryConfig ) -> None: - if INFRAHUB_CHECK_VARIABLE_TO_IMPORT not in dir(module): - return False + commit_wt = self.get_worktree(identifier=commit) + branch_wt = self.get_worktree(identifier=commit or branch_name) - checks_definition_in_graph = { - check.name.value: check - for check in await self.client.filters( - kind="CoreCheckDefinition", branch=branch_name, repository__ids=[str(self.id)] - ) - } + # Ensure the path for this repository is present in sys.path + if self.directory_root not in sys.path: + sys.path.append(self.directory_root) - local_check_definitions = {} - for check_class in getattr(module, INFRAHUB_CHECK_VARIABLE_TO_IMPORT): - graphql_query = await self.client.get( - kind="CoreGraphQLQuery", branch=branch_name, id=str(check_class.query), populate_store=True + checks = [] + for check in config_file.check_definitions: + LOGGER.debug(self.name, import_type="check_definition", file=check.file_path) + + file_info = extract_repo_file_information( + full_filename=os.path.join(branch_wt.directory, check.file_path.as_posix()), + repo_directory=self.directory_root, + worktree_directory=commit_wt.directory, ) try: - item = CheckDefinitionInformation( - name=check_class.__name__, - repository=str(self.id), - 
class_name=check_class.__name__, - check_class=check_class, - file_path=file_path, - query=str(graphql_query.id), - timeout=check_class.timeout, - rebase=check_class.rebase, - ) - local_check_definitions[item.name] = item - except Exception as exc: # pylint: disable=broad-exception-caught - LOGGER.error( - f"{self.name} | An error occured while processing the CheckDefinition {check_class.__name__} from {file_path} : {exc} " + module = importlib.import_module(file_info.module_name) + except ModuleNotFoundError as exc: + LOGGER.warning( + self.name, import_type="check_definition", file=check.file_path.as_posix(), error=str(exc) ) continue + checks.extend( + await self.get_check_definitions( + branch_name=branch_name, + module=module, + file_path=file_info.relative_path_file, + ) + ) + + local_check_definitions = {check.name: check for check in checks} + check_definition_in_graph = { + check.name.value: check + for check in await self.client.filters( + kind="CoreCheckDefinition", branch=branch_name, repository__ids=[str(self.id)] + ) + } + present_in_both, only_graph, only_local = compare_lists( - list1=list(checks_definition_in_graph.keys()), list2=list(local_check_definitions.keys()) + list1=list(check_definition_in_graph.keys()), list2=list(local_check_definitions.keys()) ) for check_name in only_local: @@ -1421,21 +1430,53 @@ async def import_python_check_definitions_from_module( for check_name in present_in_both: if not await self.compare_python_check_definition( check=local_check_definitions[check_name], - existing_check=checks_definition_in_graph[check_name], + existing_check=check_definition_in_graph[check_name], ): LOGGER.info( f"{self.name} | New version of CheckDefinition '{check_name}' found on branch {branch_name} ({commit[:8]}), updating" ) await self.update_python_check_definition( check=local_check_definitions[check_name], - existing_check=checks_definition_in_graph[check_name], + existing_check=check_definition_in_graph[check_name], ) for check_name 
in only_graph: LOGGER.info( f"{self.name} | CheckDefinition '{check_name}' not found locally in branch {branch_name}, deleting" ) - await checks_definition_in_graph[check_name].delete() + await check_definition_in_graph[check_name].delete() + + async def get_check_definitions( + self, branch_name: str, module: types.ModuleType, file_path: str + ) -> List[CheckDefinitionInformation]: + if INFRAHUB_CHECK_VARIABLE_TO_IMPORT not in dir(module): + return [] + + checks = [] + for check_class in getattr(module, INFRAHUB_CHECK_VARIABLE_TO_IMPORT): + graphql_query = await self.client.get( + kind="CoreGraphQLQuery", branch=branch_name, id=str(check_class.query), populate_store=True + ) + try: + checks.append( + CheckDefinitionInformation( + name=check_class.__name__, + repository=str(self.id), + class_name=check_class.__name__, + check_class=check_class, + file_path=file_path, + query=str(graphql_query.id), + timeout=check_class.timeout, + rebase=check_class.rebase, + ) + ) + + except Exception as exc: # pylint: disable=broad-exception-caught + LOGGER.error( + f"{self.name} | An error occured while processing the CheckDefinition {check_class.__name__} from {file_path} : {exc} " + ) + continue + return checks async def create_python_check_definition(self, branch_name: str, check: CheckDefinitionInformation) -> InfrahubNode: data = { @@ -1656,7 +1697,8 @@ async def import_all_yaml_files(self, branch_name: str, commit: str, exclude: Op method = getattr(self, f"import_objects_{key}") await method(branch_name=branch_name, commit=commit, data=data) - async def import_all_python_files(self, branch_name: str, commit: str): + async def import_all_python_files(self, branch_name: str, commit: str, config_file: InfrahubRepositoryConfig): + await self.import_python_check_definitions(branch_name=branch_name, commit=commit, config_file=config_file) commit_wt = self.get_worktree(identifier=commit) python_files = await self.find_files(extension=["py"], commit=commit) @@ -1678,9 +1720,6 @@ 
async def import_all_python_files(self, branch_name: str, commit: str): LOGGER.warning(f"{self.name} | Unable to load python file {python_file}") continue - await self.import_python_check_definitions_from_module( - branch_name=branch_name, commit=commit, module=module, file_path=file_info.relative_path_file - ) await self.import_python_transforms_from_module( branch_name=branch_name, commit=commit, module=module, file_path=file_info.relative_path_file ) @@ -1778,7 +1817,7 @@ async def execute_python_check( module = importlib.import_module(file_info.module_name) - check_class = getattr(module, class_name) + check_class: InfrahubCheck = getattr(module, class_name) check = await check_class.init(root_directory=commit_worktree.directory, branch=branch_name, client=client) await check.run() diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/.gitignore b/backend/tests/fixtures/repos/infrahub-demo-edge/.gitignore new file mode 100644 index 0000000000..999ed626d9 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/.gitignore @@ -0,0 +1,4 @@ +.vscode/* +*.pyc +*.tar.gz +.DS_Store diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml b/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml new file mode 100644 index 0000000000..3a77213c65 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml @@ -0,0 +1,35 @@ +--- +schemas: + - schemas/demo_edge_fabric.yml + +rfiles: + - name: device_startup + description: "Template to generate startup configuration for network devices" + query: "device_startup_info" + repository: "self" + template_path: "templates/device_startup_config.tpl.j2" + + - name: clab_topology + query: "topology_info" + repository: "self" + template_path: "topology/topology.tpl.j2" + +artifact_definitions: + - name: "Openconfig Interface for Arista devices" + artifact_name: "openconfig-interfaces" + parameters: + device: "name__value" + content_type: "application/json" + targets: 
"arista_devices" + transformation: "OCInterfaces" + + - name: "Startup Config for Edge devices" + artifact_name: "startup-config" + parameters: + device: "name__value" + content_type: "text/plain" + targets: "edge_router" + transformation: "device_startup" + +check_definitions: + - file_path: "checks/check_backbone_link_redundancy.py" diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.gql b/backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.gql new file mode 100644 index 0000000000..fd57c582c9 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.gql @@ -0,0 +1,45 @@ +query check_backbone_link_redundancy { + InfraCircuit(role__name__value: "backbone") { + edges { + node { + id + circuit_id { + value + } + vendor_id { + value + } + status { + node { + name { + value + } + } + } + endpoints { + edges { + node { + site { + node { + id + name { + value + } + } + } + connected_endpoint { + node { + ... 
on InfraInterface { + enabled { + value + } + } + } + } + } + } + } + } + } + } +} diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.py b/backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.py new file mode 100644 index 0000000000..ee494da5a5 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/checks/check_backbone_link_redundancy.py @@ -0,0 +1,35 @@ +from collections import defaultdict + +from infrahub.checks import InfrahubCheck + + +class InfrahubCheckBackboneLinkRedundancy(InfrahubCheck): + query = "check_backbone_link_redundancy" + + def validate(self): + site_id_by_name = {} + + backbone_links_per_site = defaultdict(lambda: defaultdict(int)) + + for circuit in self.data["data"]["circuit"]: + status = circuit["status"]["name"]["value"] + + for endpoint in circuit["endpoints"]: + site_name = endpoint["site"]["name"]["value"] + site_id_by_name[site_name] = endpoint["site"]["id"] + backbone_links_per_site[site_name]["total"] += 1 + if endpoint["connected_interface"]["enabled"]["value"] and status == "active": + backbone_links_per_site[site_name]["operational"] += 1 + + for site_name, site in backbone_links_per_site.items(): + if site.get("operational", 0) / site["total"] < 0.6: + self.log_error( + message=f"{site_name} has less than 60% of backbone circuit operational ({site.get('operational', 0)}/{site['total']})", + object_id=site_id_by_name[site_name], + object_type="site", + ) + + # rprint(backbone_links_per_site) + + +INFRAHUB_CHECKS = [InfrahubCheckBackboneLinkRedundancy] diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/schemas/demo_edge_fabric.yml b/backend/tests/fixtures/repos/infrahub-demo-edge/schemas/demo_edge_fabric.yml new file mode 100644 index 0000000000..72d9a0ed7f --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/schemas/demo_edge_fabric.yml @@ -0,0 +1,26 @@ +# yaml-language-server: 
$schema=https://schema.infrahub.app/develop/schema.schema.json +--- +version: '1.0' +nodes: + - name: EdgeFabric + namespace: Demo + description: "." + label: "EdgeFabric" + default_filter: name__value + display_labels: + - name__value + attributes: + - name: name + kind: Text + # unique: true + - name: description + kind: Text + optional: true + - name: nbr_racks + kind: Number + relationships: + - name: tags + peer: BuiltinTag + optional: true + cardinality: many + kind: Attribute diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_config.tpl.j2 b/backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_config.tpl.j2 new file mode 100644 index 0000000000..76f07d25d6 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_config.tpl.j2 @@ -0,0 +1,111 @@ +{% set ns = namespace(loopback_intf_name=none, loopback_ip=none, management_intf_name=none, management_ip=none) %} +{% for intf in data.InfraDevice.edges[0].node.interfaces.edges %} +{% if intf.node.role.node.name.value == "loopback" %} +{% set ns.loopback_intf_name = intf.node.name.value %} +{% set ns.loopback_ip = intf.node.ip_addresses.edges[0].node.address.value.split('/')[0] %} +{% elif intf.node.role.node.name.value == "management" %} +{% set ns.management_intf_name = intf.node.name.value %} +{% set ns.management_ip = intf.node.ip_addresses.edges[0].node.address.value.split('/')[0] %} +{% endif %} +{% endfor %} +no aaa root +! +username admin privilege 15 role network-admin secret sha512 $6$q4ez.aZgB/G/eeWW$ukvRobb5RtYmUlCcY0atxhwPmA6FPoRjR3AxYFJqNFoCRgJjrohKGrBsbY12n1uRZeCer1L8oejx5aPlrf.op0 +! +transceiver qsfp default-mode 4x10G +! +service routing protocols model multi-agent +! +hostname {{ data.InfraDevice.edges[0].node.name.value }} +! +spanning-tree mode mstp +! +management api http-commands + no shutdown +! +management api gnmi + transport grpc default +! +management api netconf + transport ssh default +! 
+{% for intf in data.InfraDevice.edges[0].node.interfaces.edges %}
+{% if intf.node.name.value != ns.management_intf_name and intf.node.name.value != ns.loopback_intf_name %}
+interface {{ intf.node.name.value }}
+{% if intf.node["description"]["value"] %}
+ description {{ intf.node["description"]["value"] }}
+{% else %}
+ description role: {{ intf.node.role.node.name.value }}
+{% endif %}
+{% if not intf.node["enabled"]["value"] %}
+ shutdown
+{% endif %}
+
+{% if intf.node["ip_addresses"] %}
+{% for ip in intf.node["ip_addresses"]["edges"] %}
+ ip address {{ ip.node["address"]["value"] }}
+ no switchport
+{% if intf.node.role.node.name.value == "peer" or intf.node.role.node.name.value == "backbone" %}
+ ip ospf network point-to-point
+{% endif %}
+{% endfor %}
+{% endif %}
+!
+{% endif %}
+{% endfor %}
+!
+interface {{ ns.management_intf_name }}
+{% for intf in data.InfraDevice.edges[0].node.interfaces.edges %}
+{% if intf.node.name.value == ns.management_intf_name %}
+{% for ip in intf.node.ip_addresses.edges %}
+ ip address {{ ip.node.address.value }}
+{% endfor %}
+{% endif %}
+{% endfor %}
+!
+interface {{ ns.loopback_intf_name }}
+{% for intf in data.InfraDevice.edges[0].node.interfaces.edges %}
+{% if intf.node.name.value == ns.loopback_intf_name %}
+{% for ip in intf.node.ip_addresses.edges %}
+ ip address {{ ip.node.address.value }}
+{% endfor %}
+{% endif %}
+{% endfor %}
+!
+ip prefix-list BOGON-Prefixes seq 10 permit 172.16.0.0/12 le 24
+ip prefix-list BOGON-Prefixes seq 20 permit 192.168.0.0/16 le 24
+!
+ip routing
+!
+ip route 0.0.0.0/0 172.20.20.1
+!
+{% if data.InfraDevice.edges[0].node.asn %}
+router bgp {{ data.InfraDevice.edges[0].node.asn.node.asn.value }}
+ router-id {{ ns.loopback_ip }}
+{% for peer_group in data.InfraBGPPeerGroup.edges %}
+ neighbor {{ peer_group.node.name.value }} peer group
+{% if peer_group.node.local_as %}
+ neighbor {{ peer_group.node.name.value }} local-as {{ peer_group.node.local_as.node.asn.value }}
+{% endif %}
+{% if peer_group.node.remote_as and peer_group.node.remote_as.node %}
+ neighbor {{ peer_group.node.name.value }} remote-as {{ peer_group.node.remote_as.node.asn.value }}
+{% endif %}
+{% endfor %}
+!
+{% endif %}
+!
+router ospf 1
+ router-id {{ ns.loopback_ip }}
+ redistribute connected
+ max-lsa 12000
+ passive-interface Loopback0
+ network 0.0.0.0/0 area 0.0.0.0
+!
+route-map BOGONS permit 10
+ match ip address prefix-list BOGON-Prefixes
+!
+route-map BOGONS deny 20
+!
+end
diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_info.gql b/backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_info.gql
new file mode 100644
index 0000000000..902b10ce76
--- /dev/null
+++ b/backend/tests/fixtures/repos/infrahub-demo-edge/templates/device_startup_info.gql
@@ -0,0 +1,77 @@
+query($device: String!) {
+  InfraDevice(name__value: $device) {
+    edges {
+      node {
+        id
+        name {
+          value
+        }
+        asn {
+          node {
+            asn {
+              value
+            }
+          }
+        }
+        interfaces {
+          edges {
+            node {
+              id
+              name {
+                value
+              }
+              description {
+                value
+              }
+              enabled {
+                value
+              }
+              role {
+                node {
+                  name {
+                    value
+                  }
+                }
+              }
+              ...
on InfraInterfaceL3 {
+                ip_addresses {
+                  edges {
+                    node {
+                      address {
+                        value
+                      }
+                    }
+                  }
+                }
+              }
+            }
+          }
+        }
+
+      }
+    }
+  }
+  InfraBGPPeerGroup {
+    edges {
+      node {
+        name {
+          value
+        }
+        local_as {
+          node {
+            asn {
+              value
+            }
+          }
+        }
+        remote_as {
+          node {
+            asn {
+              value
+            }
+          }
+        }
+      }
+    }
+  }
+}
\ No newline at end of file
diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/conftest.py b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/conftest.py
new file mode 100644
index 0000000000..7cced50646
--- /dev/null
+++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/conftest.py
@@ -0,0 +1,56 @@
+import os
+from pathlib import Path
+from typing import Tuple
+
+import pytest
+import ujson
+from infrahub_sdk import InfrahubClientSync
+
+
+class TestHelper:
+    """TestHelper provides functions that can be used as a fixture throughout the test framework"""
+
+    @staticmethod
+    def fixture_file(file_name: str) -> dict:
+        """Return the contents of a fixture file as a dictionary"""
+        file_content = Path(os.path.join(TestHelper.get_fixtures_dir(), file_name)).read_text()
+
+        return ujson.loads(file_content)
+
+    @staticmethod
+    def fixture_files(directory_name: str) -> Tuple[dict, dict]:
+        """Return the input data and expected response fixture files as dictionaries"""
+
+        data_file = TestHelper.fixture_file(os.path.join(directory_name, "data.json"))
+
+        if "data" in data_file:
+            data_file = data_file["data"]
+
+        response_file = TestHelper.fixture_file(os.path.join(directory_name, "response.json"))
+
+        return (data_file, response_file)
+
+    @staticmethod
+    def get_fixtures_dir():
+        """Get the directory which stores fixtures that are common to multiple unit/integration tests."""
+        here = os.path.abspath(os.path.dirname(__file__))
+        fixtures_dir = os.path.join(here, "fixtures")
+
+        return os.path.abspath(fixtures_dir)
+
+
+@pytest.fixture()
+def root_directory() -> str:
+    here = os.path.abspath(os.path.dirname(__file__))
+    root_dir = os.path.join(here, "../")
+    return
os.path.abspath(root_dir) + + +@pytest.fixture() +def helper() -> TestHelper: + return TestHelper() + + +@pytest.fixture() +def client_sync() -> InfrahubClientSync: + return InfrahubClientSync.init(address="http://localhost:8000", insert_tracker=True) diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/data.json b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/data.json new file mode 100644 index 0000000000..e5d470b133 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/data.json @@ -0,0 +1,481 @@ +{ + "data": { + "bgp_session": { + "edges": [ + { + "node": { + "id": "d0d47169-b1f3-4255-ab86-c06b39ea8d84", + "peer_group": { + "node": { + "name": { + "value": "TRANSIT_TELIA" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "203.0.113.9/29" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "203.0.113.10/29" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 1299 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "d2e49bff-ddf2-484b-a4f5-f1ac076a2a2e", + "peer_group": { + "node": { + "name": { + "value": "TRANSIT_DEFAULT" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "203.0.113.49/29" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "203.0.113.50/29" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 8220 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "066cb468-3229-4595-a4a4-a88b3bef082c", + "peer_group": { + "node": { + "name": { + "value": "POP_INTERNAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.7/32" + } 
+ } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "4ece88b2-665c-4929-ac3e-96430f91974a", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.3/32" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "4b566879-b47d-452a-9b5a-2e2cee9190b7", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.8/32" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "f315994a-0caa-4677-b65d-d9dda58b06f3", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.4/32" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "1815f356-9f87-41c8-b817-42ecf6501628", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.5/32" + } + } + }, + "local_as": { + 
"node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "8c921dee-fe45-48c7-a940-ef10a2384a19", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.1/32" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "69a3dc11-5440-403c-8013-40d6d0ebad0a", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.6/32" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "3c82dabf-977b-4fce-b9a1-2f5f3be52a18", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.9/32" + } + } + }, + "local_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + }, + { + "node": { + "id": "207ea49c-6af7-4a92-9a99-3b38d73d0487", + "peer_group": { + "node": { + "name": { + "value": "POP_GLOBAL" + } + } + }, + "local_ip": { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + }, + "remote_ip": { + "node": { + "address": { + "value": "10.0.0.10/32" + } + } + }, + "local_as": { + "node": { + "asn": { + 
"value": 64496 + } + } + }, + "remote_as": { + "node": { + "asn": { + "value": 64496 + } + } + }, + "description": { + "value": null + } + } + } + ] + } + } + } \ No newline at end of file diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/response.json b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/response.json new file mode 100644 index 0000000000..53ddae5fbc --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_bgp_neighbors/test01/response.json @@ -0,0 +1,105 @@ +{ + "openconfig-bgp:neighbors": { + "neighbor": [ + { + "neighbor-address": "203.0.113.10", + "config": { + "neighbor-address": "203.0.113.10", + "peer-group": "TRANSIT_TELIA", + "peer-as": 1299, + "local-as": 64496 + } + }, + { + "neighbor-address": "203.0.113.50", + "config": { + "neighbor-address": "203.0.113.50", + "peer-group": "TRANSIT_DEFAULT", + "peer-as": 8220, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.7", + "config": { + "neighbor-address": "10.0.0.7", + "peer-group": "POP_INTERNAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.3", + "config": { + "neighbor-address": "10.0.0.3", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.8", + "config": { + "neighbor-address": "10.0.0.8", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.4", + "config": { + "neighbor-address": "10.0.0.4", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.5", + "config": { + "neighbor-address": "10.0.0.5", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.1", + "config": { + "neighbor-address": "10.0.0.1", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + 
"neighbor-address": "10.0.0.6", + "config": { + "neighbor-address": "10.0.0.6", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.9", + "config": { + "neighbor-address": "10.0.0.9", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + }, + { + "neighbor-address": "10.0.0.10", + "config": { + "neighbor-address": "10.0.0.10", + "peer-group": "POP_GLOBAL", + "peer-as": 64496, + "local-as": 64496 + } + } + ] + } +} \ No newline at end of file diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/data.json b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/data.json new file mode 100644 index 0000000000..563f822403 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/data.json @@ -0,0 +1,275 @@ +{ + "data": { + "device": { + "edges": [ + { + "node": { + "id": "67bbcb4f-5ee7-4c9b-a598-58985ff3dbd4", + "interfaces": { + "edges": [ + { + "node": { + "name": { + "value": "Ethernet2" + }, + "description": { + "value": "Connected to ord1-edge2 Ethernet2" + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet3" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet12" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + } + } + }, + { + "node": { + "name": { + "value": "Ethernet4" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet6" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [ + { + "node": { + "address": { + "value": "203.0.113.49/29" + } + } + } + 
] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet10" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet1" + }, + "description": { + "value": "Connected to ord1-edge2 Ethernet1" + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet11" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + } + } + }, + { + "node": { + "name": { + "value": "Ethernet9" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [ + { + "node": { + "address": { + "value": "203.0.113.81/29" + } + } + } + ] + } + } + }, + { + "node": { + "name": { + "value": "Loopback0" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [ + { + "node": { + "address": { + "value": "10.0.0.2/32" + } + } + } + ] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet5" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [ + { + "node": { + "address": { + "value": "203.0.113.9/29" + } + } + } + ] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet7" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + }, + { + "node": { + "name": { + "value": "Management0" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [ + { + "node": { + "address": { + "value": "172.20.20.18/24" + } + } + } + ] + } + } + }, + { + "node": { + "name": { + "value": "Ethernet8" + }, + "description": { + "value": null + }, + "enabled": { + "value": true + }, + "ip_addresses": { + "edges": [] + } + } + } + ] + } + } + } + ] + } + } + } \ No newline at end of file diff --git 
a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/response.json b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/response.json new file mode 100644 index 0000000000..9dc5318f1e --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/fixtures/oc_interfaces/test01/response.json @@ -0,0 +1,226 @@ +{ + "openconfig-interfaces:interface": [ + { + "name": "Ethernet2", + "config": { + "enabled": true, + "description": "Connected to ord1-edge2 Ethernet2" + }, + "subinterfaces": { + "subinterface": [] + } + }, + { + "name": "Ethernet3", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [] + } + }, + { + "name": "Ethernet12", + "config": { + "enabled": true + } + }, + { + "name": "Ethernet4", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [] + } + }, + { + "name": "Ethernet6", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [ + { + "index": 0, + "openconfig-if-ip:ipv4": { + "addresses": { + "address": [ + { + "ip": "203.0.113.49", + "config": { + "ip": "203.0.113.49", + "prefix-length": "29" + } + } + ] + }, + "config": { + "enabled": true + } + } + } + ] + } + }, + { + "name": "Ethernet10", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [] + } + }, + { + "name": "Ethernet1", + "config": { + "enabled": true, + "description": "Connected to ord1-edge2 Ethernet1" + }, + "subinterfaces": { + "subinterface": [] + } + }, + { + "name": "Ethernet11", + "config": { + "enabled": true + } + }, + { + "name": "Ethernet9", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [ + { + "index": 0, + "openconfig-if-ip:ipv4": { + "addresses": { + "address": [ + { + "ip": "203.0.113.81", + "config": { + "ip": "203.0.113.81", + "prefix-length": "29" + } + } + ] + }, + "config": { + "enabled": true + } + } + } + ] + } + }, + { + "name": "Loopback0", + "config": { + "enabled": 
true + }, + "subinterfaces": { + "subinterface": [ + { + "index": 0, + "openconfig-if-ip:ipv4": { + "addresses": { + "address": [ + { + "ip": "10.0.0.2", + "config": { + "ip": "10.0.0.2", + "prefix-length": "32" + } + } + ] + }, + "config": { + "enabled": true + } + } + } + ] + } + }, + { + "name": "Ethernet5", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [ + { + "index": 0, + "openconfig-if-ip:ipv4": { + "addresses": { + "address": [ + { + "ip": "203.0.113.9", + "config": { + "ip": "203.0.113.9", + "prefix-length": "29" + } + } + ] + }, + "config": { + "enabled": true + } + } + } + ] + } + }, + { + "name": "Ethernet7", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [] + } + }, + { + "name": "Management0", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [ + { + "index": 0, + "openconfig-if-ip:ipv4": { + "addresses": { + "address": [ + { + "ip": "172.20.20.18", + "config": { + "ip": "172.20.20.18", + "prefix-length": "24" + } + } + ] + }, + "config": { + "enabled": true + } + } + } + ] + } + }, + { + "name": "Ethernet8", + "config": { + "enabled": true + }, + "subinterfaces": { + "subinterface": [] + } + } + ] +} \ No newline at end of file diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/graphql/test_graphql_query.py b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/graphql/test_graphql_query.py new file mode 100644 index 0000000000..ff6f68e5d2 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/graphql/test_graphql_query.py @@ -0,0 +1,20 @@ +import pytest +from infrahub_ctl.utils import find_graphql_query +from infrahub_sdk import InfrahubClientSync + + +@pytest.mark.parametrize( + "query_name,variables", + [ + ("device_startup_info", {"device": "ord1-edge1"}), + ("oc_interfaces", {"device": "ord1-edge1"}), + ("oc_bgp_neighbors", {"device": "ord1-edge1"}), + ("topology_info", {}), + 
("check_backbone_link_redundancy", {}), + ], +) +def test_graphql_queries(root_directory, client_sync: InfrahubClientSync, query_name: str, variables: dict): + query_str = find_graphql_query(name=query_name, directory=root_directory) + response = client_sync.execute_graphql(query=query_str, variables=variables, raise_for_error=False) + + assert "errors" not in response diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/transforms/test_openconfig_integration.py b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/transforms/test_openconfig_integration.py new file mode 100644 index 0000000000..4c899848c3 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/integration/transforms/test_openconfig_integration.py @@ -0,0 +1,24 @@ +from infrahub_ctl.utils import find_graphql_query +from transforms.openconfig import OCBGPNeighbors, OCInterfaces + + +async def test_oc_interfaces_standard(helper, root_directory, client_sync): + transform = OCInterfaces() + query = find_graphql_query(name=transform.query, directory=root_directory) + data = client_sync.execute_graphql(query=query, variables={"device": "ord1-edge1"}) + + response = await transform.transform(data=data) + assert "openconfig-interfaces:interface" in response + assert len(response["openconfig-interfaces:interface"]) > 2 + + +async def test_oc_bgp_neighbors_standard(helper, root_directory, client_sync): + transform = OCBGPNeighbors() + query = find_graphql_query(name=transform.query, directory=root_directory) + data = client_sync.execute_graphql(query=query, variables={"device": "ord1-edge1"}) + + response = await transform.transform(data=data) + + assert "openconfig-bgp:neighbors" in response + assert "neighbor" in response["openconfig-bgp:neighbors"] + assert len(response["openconfig-bgp:neighbors"]["neighbor"]) > 2 diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/tests/unit/transforms/test_openconfig_unit.py 
b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/unit/transforms/test_openconfig_unit.py new file mode 100644 index 0000000000..d11bf8178e --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/tests/unit/transforms/test_openconfig_unit.py @@ -0,0 +1,13 @@ +from transforms.openconfig import OCBGPNeighbors, OCInterfaces + + +async def test_oc_interfaces_standard(helper): + data, response = helper.fixture_files(directory_name="oc_interfaces/test01") + transform = OCInterfaces() + assert await transform.transform(data=data) == response + + +async def test_oc_bgp_neighbors_standard(helper): + data, response = helper.fixture_files(directory_name="oc_bgp_neighbors/test01") + transform = OCBGPNeighbors() + assert await transform.transform(data=data) == response diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/topology/topology_info.gql b/backend/tests/fixtures/repos/infrahub-demo-edge/topology/topology_info.gql new file mode 100644 index 0000000000..93c1f455fb --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/topology/topology_info.gql @@ -0,0 +1,36 @@ +query { + InfraDevice { + edges { + node { + name { + value + } + interfaces { + edges { + node { + id + role { + node { + name { + value + } + } + } + ... on InfraInterfaceL3 { + ip_addresses { + edges { + node { + address { + value + } + } + } + } + } + } + } + } + } + } + } +} diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_bgp_neighbors.gql b/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_bgp_neighbors.gql new file mode 100644 index 0000000000..8ffd303eda --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_bgp_neighbors.gql @@ -0,0 +1,47 @@ +query oc_bgp_neighbors ($device: String!) 
{ + InfraBGPSession(device__name__value: $device) { + edges { + node { + id + peer_group { + node { + name { + value + } + } + } + local_ip { + node { + address { + value + } + } + } + remote_ip { + node { + address { + value + } + } + } + local_as { + node { + asn { + value + } + } + } + remote_as { + node { + asn { + value + } + } + } + description { + value + } + } + } + } +} diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_interfaces.gql b/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_interfaces.gql new file mode 100644 index 0000000000..3f91bbf8af --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/oc_interfaces.gql @@ -0,0 +1,35 @@ +query oc_interfaces ($device: String!) { + InfraDevice(name__value: $device) { + edges { + node { + id + interfaces { + edges { + node { + name { + value + } + description { + value + } + enabled { + value + } + ... on InfraInterfaceL3 { + ip_addresses { + edges { + node { + address { + value + } + } + } + } + } + } + } + } + } + } + } +} diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/openconfig.py b/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/openconfig.py new file mode 100644 index 0000000000..98e75277f7 --- /dev/null +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/transforms/openconfig.py @@ -0,0 +1,83 @@ +from infrahub.transforms import InfrahubTransform + + +class OCInterfaces(InfrahubTransform): + query = "oc_interfaces" + url = "openconfig/interfaces" + + async def transform(self, data): + response_payload = {} + response_payload["openconfig-interfaces:interface"] = [] + + for intf in data["InfraDevice"]["edges"][0]["node"]["interfaces"]["edges"]: + intf_name = intf["node"]["name"]["value"] + + intf_config = { + "name": intf_name, + "config": {"enabled": intf["node"]["enabled"]["value"]}, + } + + if intf["node"].get("description", None) and intf["node"]["description"]["value"]: + 
intf_config["config"]["description"] = intf["node"]["description"]["value"] + + if intf["node"].get("ip_addresses", None): + intf_config["subinterfaces"] = {"subinterface": []} + + for idx, ip in enumerate(intf["node"]["ip_addresses"]["edges"]): + address, mask = ip["node"]["address"]["value"].split("/") + intf_config["subinterfaces"]["subinterface"].append( + { + "index": idx, + "openconfig-if-ip:ipv4": { + "addresses": { + "address": [ + { + "ip": address, + "config": { + "ip": address, + "prefix-length": mask, + }, + } + ] + }, + "config": {"enabled": True}, + }, + } + ) + + response_payload["openconfig-interfaces:interface"].append(intf_config) + + return response_payload + + +class OCBGPNeighbors(InfrahubTransform): + query = "oc_bgp_neighbors" + url = "openconfig/network-instances/network-instance/protocols/protocol/bgp/neighbors" + + async def transform(self, data): + response_payload = {} + + response_payload["openconfig-bgp:neighbors"] = {"neighbor": []} + + for session in data["InfraBGPSession"]["edges"]: + neighbor_address = session["node"]["remote_ip"]["node"]["address"]["value"].split("/")[0] + session_data = { + "neighbor-address": neighbor_address, + "config": {"neighbor-address": neighbor_address}, + } + + if session["node"]["peer_group"]: + session_data["config"]["peer-group"] = session["node"]["peer_group"]["node"]["name"]["value"] + + if session["node"]["remote_as"]: + session_data["config"]["peer-as"] = session["node"]["remote_as"]["node"]["asn"]["value"] + + if session["node"]["local_as"]: + session_data["config"]["local-as"] = session["node"]["local_as"]["node"]["asn"]["value"] + + response_payload["openconfig-bgp:neighbors"]["neighbor"].append(session_data) + + return response_payload + + +INFRAHUB_TRANSFORMS = [OCInterfaces, OCBGPNeighbors] diff --git a/backend/tests/integration/git/conftest.py b/backend/tests/integration/git/conftest.py index 91ee404b4b..2984ffa89f 100644 --- a/backend/tests/integration/git/conftest.py +++ 
b/backend/tests/integration/git/conftest.py @@ -1,8 +1,9 @@ import os -import tarfile +import shutil from typing import Dict import pytest +from git.repo import Repo import infrahub.config as config @@ -33,11 +34,12 @@ def git_upstream_repo_10(helper, git_sources_dir) -> Dict[str, str]: name = "infrahub-demo-edge" fixtures_dir = helper.get_fixtures_dir() - fixture_repo = os.path.join(fixtures_dir, "infrahub-demo-edge-cff6665.tar.gz") - # Extract the fixture package in the source directory - file = tarfile.open(fixture_repo) - file.extractall(git_sources_dir) - file.close() + test_base = os.path.join(fixtures_dir, f"repos/{name}") + shutil.copytree(test_base, f"{git_sources_dir}/{name}") + origin = Repo.init(f"{git_sources_dir}/{name}", initial_branch="main") + for untracked in origin.untracked_files: + origin.index.add(untracked) + origin.index.commit("First commit") return dict(name=name, path=str(os.path.join(git_sources_dir, name))) diff --git a/backend/tests/integration/git/test_git_repository.py b/backend/tests/integration/git/test_git_repository.py index 5dd704d13b..f4bd4f8752 100644 --- a/backend/tests/integration/git/test_git_repository.py +++ b/backend/tests/integration/git/test_git_repository.py @@ -160,7 +160,10 @@ async def test_import_all_python_files( self, db: InfrahubDatabase, client: InfrahubClient, repo: InfrahubRepository, query_99 ): commit = repo.get_commit_value(branch_name="main") - await repo.import_all_python_files(branch_name="main", commit=commit) + config_file = await repo.get_repository_config(branch_name="main", commit=commit) + assert config_file + + await repo.import_all_python_files(branch_name="main", commit=commit, config_file=config_file) check_definitions = await client.all(kind="CoreCheckDefinition") assert len(check_definitions) >= 1 @@ -170,7 +173,7 @@ async def test_import_all_python_files( # Validate if the function is idempotent, another import just after the first one shouldn't change anything nbr_relationships_before = 
await count_relationships(db=db) - await repo.import_all_python_files(branch_name="main", commit=commit) + await repo.import_all_python_files(branch_name="main", commit=commit, config_file=config_file) assert await count_relationships(db=db) == nbr_relationships_before # 1. Modify an object to validate if its being properly updated @@ -213,7 +216,7 @@ async def test_import_all_python_files( ) await obj2.save(db=db) - await repo.import_all_python_files(branch_name="main", commit=commit) + await repo.import_all_python_files(branch_name="main", commit=commit, config_file=config_file) modified_check0 = await client.get(kind="CoreCheckDefinition", id=check_definitions[0].id) assert modified_check0.timeout.value == check_timeout_value_before_change diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index 30b1b6a7c2..e16628744e 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -37,9 +37,14 @@ class InfrahubRepositoryRFileConfig(pydantic.BaseModel): template_path: Path +class InfrahubCheckDefinitionConfig(pydantic.BaseModel): + file_path: Path = pydantic.Field(..., description="The file within the repo with the check code.") + + class InfrahubRepositoryConfig(pydantic.BaseModel): + check_definitions: List[InfrahubCheckDefinitionConfig] = pydantic.Field(default_factory=list) schemas: List[Path] = pydantic.Field(default_factory=list) - rfiles: Optional[List[InfrahubRepositoryRFileConfig]] = pydantic.Field(default_factory=list) + rfiles: List[InfrahubRepositoryRFileConfig] = pydantic.Field(default_factory=list) # --------------------------------------------------------------------------------- From 6dd54e144f18ae82c5ab1c8929924ab5e8191887 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Fri, 17 Nov 2023 11:19:31 +0100 Subject: [PATCH 088/446] Replace repo for e2e tests and change object import order The python files with transforms need to be imported before the artifact_definitions can be imported. 
--- backend/infrahub/git/repository.py | 2 +- tasks/demo.py | 14 +++++++++++--- 2 files changed, 12 insertions(+), 4 deletions(-) diff --git a/backend/infrahub/git/repository.py b/backend/infrahub/git/repository.py index 3e878441e7..6244dd47b1 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -1024,10 +1024,10 @@ async def import_objects_from_files(self, branch_name: str, commit: Optional[str await self.import_schema_files(branch_name=branch_name, commit=commit) await self.import_all_graphql_query(branch_name=branch_name, commit=commit) - await self.import_all_yaml_files(branch_name=branch_name, commit=commit) config_file = await self.get_repository_config(branch_name=branch_name, commit=commit) if config_file: await self.import_all_python_files(branch_name=branch_name, commit=commit, config_file=config_file) + await self.import_all_yaml_files(branch_name=branch_name, commit=commit) async def import_objects_rfiles(self, branch_name: str, commit: str, data: List[dict]): LOGGER.debug(f"{self.name} | Importing all RFiles in branch {branch_name} ({commit}) ") diff --git a/tasks/demo.py b/tasks/demo.py index f2420dcd0c..a1046a720e 100644 --- a/tasks/demo.py +++ b/tasks/demo.py @@ -192,17 +192,25 @@ def load_infra_data(context: Context, database: str = INFRAHUB_DATABASE): @task(optional=["database"]) def infra_git_import(context: Context, database: str = INFRAHUB_DATABASE): """Load some demo data.""" - PACKAGE_NAME = "infrahub-demo-edge-cff6665.tar.gz" + REPO_NAME = "infrahub-demo-edge" with context.cd(ESCAPED_REPO_PATH): compose_files_cmd = build_compose_files_cmd(database=database) base_cmd = f"{get_env_vars(context)} docker compose {compose_files_cmd} -p {BUILD_NAME}" execute_command( context=context, - command=f"{base_cmd} cp backend/tests/fixtures/{PACKAGE_NAME} infrahub-git:/remote/infrahub-demo-edge-develop.tar.gz", + command=f"{base_cmd} run infrahub-git cp -r backend/tests/fixtures/repos/{REPO_NAME} /remote/", ) 
execute_command( context=context, - command=f"{base_cmd} exec --workdir /remote infrahub-git tar -xvzf infrahub-demo-edge-develop.tar.gz", + command=f"{base_cmd} exec --workdir /remote/{REPO_NAME} infrahub-git git init --initial-branch main", + ) + execute_command( + context=context, + command=f"{base_cmd} exec --workdir /remote/{REPO_NAME} infrahub-git git add .", + ) + execute_command( + context=context, + command=f"{base_cmd} exec --workdir /remote/{REPO_NAME} infrahub-git git commit -m first", ) From 1bd77193e9edb27d14f6eb39c7e36bc387031e94 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Tue, 14 Nov 2023 17:26:36 +0100 Subject: [PATCH 089/446] Rework imports from git repositories --- backend/infrahub/git/__init__.py | 2 - backend/infrahub/git/repository.py | 138 ++++++------------ .../integration/git/test_git_repository.py | 12 +- python_sdk/infrahub_sdk/schema.py | 35 ++++- 4 files changed, 82 insertions(+), 105 deletions(-) diff --git a/backend/infrahub/git/__init__.py b/backend/infrahub/git/__init__.py index a5f7825be6..f701575655 100644 --- a/backend/infrahub/git/__init__.py +++ b/backend/infrahub/git/__init__.py @@ -7,7 +7,6 @@ GraphQLQueryInformation, InfrahubRepository, RepoFileInformation, - RFileInformation, TransformPythonInformation, Worktree, extract_repo_file_information, @@ -20,7 +19,6 @@ "TEMPORARY_DIRECTORY_NAME", "ArtifactGenerateResult", "InfrahubRepository", - "RFileInformation", "TransformPythonInformation", "CheckDefinitionInformation", "RepoFileInformation", diff --git a/backend/infrahub/git/repository.py b/backend/infrahub/git/repository.py index 6244dd47b1..e4b8ba1802 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -44,6 +44,7 @@ if TYPE_CHECKING: from infrahub_sdk.branch import BranchData + from infrahub_sdk.schema import InfrahubRepositoryArtifactDefinitionConfig, InfrahubRepositoryRFileConfig from infrahub.message_bus import messages # pylint: 
disable=too-few-public-methods,too-many-lines @@ -93,34 +94,6 @@ class GraphQLQueryInformation(BaseModel): """Query in string format""" -class RFileInformation(BaseModel): - name: str - """Name of the RFile""" - - description: Optional[str] - """Description of the RFile""" - - query: str - """ID or name of the GraphQL Query associated with this RFile""" - - repository: str = "self" - """ID of the associated repository or self""" - - template_path: str - """Path to the template file within the repo""" - - -class ArtifactDefinitionInformation(BaseModel): - name: str - """Name of the Artifact Definition""" - - artifact_name: Optional[str] - parameters: dict - content_type: str - targets: str - transformation: str - - class CheckDefinitionInformation(BaseModel): name: str """Name of the check""" @@ -1022,14 +995,21 @@ async def import_objects_from_files(self, branch_name: str, commit: Optional[str if not commit: commit = self.get_commit_value(branch_name=branch_name) - await self.import_schema_files(branch_name=branch_name, commit=commit) - await self.import_all_graphql_query(branch_name=branch_name, commit=commit) config_file = await self.get_repository_config(branch_name=branch_name, commit=commit) + + if config_file: + await self.import_schema_files(branch_name=branch_name, commit=commit, config_file=config_file) + + await self.import_all_graphql_query(branch_name=branch_name, commit=commit) + if config_file: await self.import_all_python_files(branch_name=branch_name, commit=commit, config_file=config_file) - await self.import_all_yaml_files(branch_name=branch_name, commit=commit) - async def import_objects_rfiles(self, branch_name: str, commit: str, data: List[dict]): + if config_file: + await self.import_rfiles(branch_name=branch_name, commit=commit, config_file=config_file) + await self.import_artifact_definitions(branch_name=branch_name, commit=commit, config_file=config_file) + + async def import_rfiles(self, branch_name: str, commit: str, config_file: 
InfrahubRepositoryConfig): LOGGER.debug(f"{self.name} | Importing all RFiles in branch {branch_name} ({commit}) ") schema = await self.client.schema.get(kind="CoreRFile", branch=branch_name) @@ -1042,10 +1022,9 @@ async def import_objects_rfiles(self, branch_name: str, commit: str, data: List[ local_rfiles = {} # Process the list of local RFile to organize them by name - for rfile in data: + for rfile in config_file.rfiles: try: - item = RFileInformation(**rfile) - self.client.schema.validate_data_against_schema(schema=schema, data=rfile) + self.client.schema.validate_data_against_schema(schema=schema, data=rfile.dict(exclude_none=True)) except PydanticValidationError as exc: for error in exc.errors(): LOGGER.error(f" {'/'.join(error['loc'])} | {error['msg']} ({error['type']})") @@ -1055,16 +1034,16 @@ async def import_objects_rfiles(self, branch_name: str, commit: str, data: List[ continue # Insert the ID of the current repository if required - if item.repository == "self": - item.repository = self.id + if rfile.repository == "self": + rfile.repository = self.id # Query the GraphQL query and (eventually) replace the name with the ID graphql_query = await self.client.get( - kind="CoreGraphQLQuery", branch=branch_name, id=str(item.query), populate_store=True + kind="CoreGraphQLQuery", branch=branch_name, id=str(rfile.query), populate_store=True ) - item.query = graphql_query.id + rfile.query = graphql_query.id - local_rfiles[item.name] = item + local_rfiles[rfile.name] = rfile present_in_both, only_graph, only_local = compare_lists( list1=list(rfiles_in_graph.keys()), list2=list(local_rfiles.keys()) @@ -1089,17 +1068,17 @@ async def import_objects_rfiles(self, branch_name: str, commit: str, data: List[ LOGGER.info(f"{self.name} | RFile '{rfile_name}' not found locally in branch {branch_name}, deleting") await rfiles_in_graph[rfile_name].delete() - async def create_rfile(self, branch_name: str, data: RFileInformation) -> InfrahubNode: + async def create_rfile(self, 
branch_name: str, data: InfrahubRepositoryRFileConfig) -> InfrahubNode: schema = await self.client.schema.get(kind="CoreRFile", branch=branch_name) create_payload = self.client.schema.generate_payload_create( - schema=schema, data=data.dict(), source=self.id, is_protected=True + schema=schema, data=data.payload, source=self.id, is_protected=True ) obj = await self.client.create(kind="CoreRFile", branch=branch_name, **create_payload) await obj.save() return obj @classmethod - async def compare_rfile(cls, existing_rfile: InfrahubNode, local_rfile: RFileInformation) -> bool: + async def compare_rfile(cls, existing_rfile: InfrahubNode, local_rfile: InfrahubRepositoryRFileConfig) -> bool: # pylint: disable=no-member if ( existing_rfile.description.value != local_rfile.description @@ -1110,7 +1089,7 @@ async def compare_rfile(cls, existing_rfile: InfrahubNode, local_rfile: RFileInf return True - async def update_rfile(self, existing_rfile: InfrahubNode, local_rfile: RFileInformation) -> None: + async def update_rfile(self, existing_rfile: InfrahubNode, local_rfile: InfrahubRepositoryRFileConfig) -> None: # pylint: disable=no-member if existing_rfile.description.value != local_rfile.description: existing_rfile.description.value = local_rfile.description @@ -1118,12 +1097,12 @@ async def update_rfile(self, existing_rfile: InfrahubNode, local_rfile: RFileInf if existing_rfile.query.id != local_rfile.query: existing_rfile.query = {"id": local_rfile.query, "source": str(self.id), "is_protected": True} - if existing_rfile.template_path.value != local_rfile.template_path: - existing_rfile.template_path.value = local_rfile.template_path + if existing_rfile.template_path.value != local_rfile.template_path_value: + existing_rfile.template_path.value = local_rfile.template_path_value await existing_rfile.save() - async def import_objects_artifact_definitions(self, branch_name: str, commit: str, data: List[dict]): + async def import_artifact_definitions(self, branch_name: str, 
commit: str, config_file: InfrahubRepositoryConfig): LOGGER.debug(f"{self.name} | Importing all Artifact Definitions in branch {branch_name} ({commit}) ") schema = await self.client.schema.get(kind="CoreArtifactDefinition", branch=branch_name) @@ -1133,13 +1112,12 @@ async def import_objects_artifact_definitions(self, branch_name: str, commit: st for artdef in await self.client.filters(kind="CoreArtifactDefinition", branch=branch_name) } - local_artifact_defs = {} + local_artifact_defs: Dict[str, InfrahubRepositoryArtifactDefinitionConfig] = {} # Process the list of local RFile to organize them by name - for artdef in data: + for artdef in config_file.artifact_definitions: try: - item = ArtifactDefinitionInformation(**artdef) - self.client.schema.validate_data_against_schema(schema=schema, data=artdef) + self.client.schema.validate_data_against_schema(schema=schema, data=artdef.dict(exclude_none=True)) except PydanticValidationError as exc: for error in exc.errors(): LOGGER.error(f" {'/'.join(error['loc'])} | {error['msg']} ({error['type']})") @@ -1148,7 +1126,7 @@ async def import_objects_artifact_definitions(self, branch_name: str, commit: st LOGGER.error(exc.message) continue - local_artifact_defs[item.name] = item + local_artifact_defs[artdef.name] = artdef present_in_both, _, only_local = compare_lists( list1=list(artifact_defs_in_graph.keys()), list2=list(local_artifact_defs.keys()) @@ -1173,7 +1151,9 @@ async def import_objects_artifact_definitions(self, branch_name: str, commit: st local_artifact_definition=local_artifact_defs[artdef_name], ) - async def create_artifact_definition(self, branch_name: str, data: ArtifactDefinitionInformation) -> InfrahubNode: + async def create_artifact_definition( + self, branch_name: str, data: InfrahubRepositoryArtifactDefinitionConfig + ) -> InfrahubNode: schema = await self.client.schema.get(kind="CoreArtifactDefinition", branch=branch_name) create_payload = self.client.schema.generate_payload_create( schema=schema, 
data=data.dict(), source=self.id, is_protected=True @@ -1184,7 +1164,9 @@ async def create_artifact_definition(self, branch_name: str, data: ArtifactDefin @classmethod async def compare_artifact_definition( - cls, existing_artifact_definition: InfrahubNode, local_artifact_definition: RFileInformation + cls, + existing_artifact_definition: InfrahubNode, + local_artifact_definition: InfrahubRepositoryArtifactDefinitionConfig, ) -> bool: # pylint: disable=no-member if ( @@ -1194,8 +1176,12 @@ async def compare_artifact_definition( ): return False + return True + async def update_artifact_definition( - self, existing_artifact_definition: InfrahubNode, local_artifact_definition: RFileInformation + self, + existing_artifact_definition: InfrahubNode, + local_artifact_definition: InfrahubRepositoryArtifactDefinitionConfig, ) -> None: # pylint: disable=no-member if existing_artifact_definition.artifact_name.value != local_artifact_definition.artifact_name: @@ -1235,14 +1221,10 @@ async def get_repository_config(self, branch_name: str, commit: str) -> Optional ) return - async def import_schema_files(self, branch_name: str, commit: str) -> None: + async def import_schema_files(self, branch_name: str, commit: str, config_file: InfrahubRepositoryConfig) -> None: # pylint: disable=too-many-branches - config_file = await self.get_repository_config(branch_name=branch_name, commit=commit) branch_wt = self.get_worktree(identifier=commit or branch_name) - if not config_file: - return - schemas_data: List[YamlFile] = [] for schema in config_file.schemas: @@ -1631,7 +1613,7 @@ async def create_python_transform(self, branch_name: str, transform: TransformPy async def update_python_transform( self, existing_transform: InfrahubNode, local_transform: TransformPythonInformation - ) -> bool: + ) -> None: if existing_transform.query.id != local_transform.query: existing_transform.query = {"id": local_transform.query, "source": str(self.id), "is_protected": True} @@ -1663,40 +1645,6 @@ async 
def compare_python_transform( return False return True - async def import_all_yaml_files(self, branch_name: str, commit: str, exclude: Optional[List[str]] = None): - yaml_files = await self.find_files(extension=["yml", "yaml"], commit=commit) - - for yaml_file in yaml_files: - LOGGER.debug(f"{self.name} | Checking {yaml_file}") - - # ------------------------------------------------------ - # Import Yaml - # ------------------------------------------------------ - with open(yaml_file, "r", encoding="UTF-8") as file_data: - yaml_data = file_data.read() - - try: - data = yaml.safe_load(yaml_data) - except yaml.YAMLError as exc: - LOGGER.warning(f"{self.name} | Unable to load YAML file {yaml_file} : {exc}") - continue - - if not isinstance(data, dict): - LOGGER.debug(f"{self.name} | {yaml_file} : payload is not a dictionnary .. SKIPPING") - continue - - # ------------------------------------------------------ - # Search for Valid object types - # ------------------------------------------------------ - for key, data in data.items(): - if exclude and key in exclude: - continue - if not hasattr(self, f"import_objects_{key}"): - continue - - method = getattr(self, f"import_objects_{key}") - await method(branch_name=branch_name, commit=commit, data=data) - async def import_all_python_files(self, branch_name: str, commit: str, config_file: InfrahubRepositoryConfig): await self.import_python_check_definitions(branch_name=branch_name, commit=commit, config_file=config_file) commit_wt = self.get_worktree(identifier=commit) diff --git a/backend/tests/integration/git/test_git_repository.py b/backend/tests/integration/git/test_git_repository.py index f4bd4f8752..ab34244643 100644 --- a/backend/tests/integration/git/test_git_repository.py +++ b/backend/tests/integration/git/test_git_repository.py @@ -115,7 +115,9 @@ async def repo(self, test_client, client, db: InfrahubDatabase, git_upstream_rep async def test_import_schema_files(self, db: InfrahubDatabase, client: InfrahubClient, 
repo: InfrahubRepository): commit = repo.get_commit_value(branch_name="main") - await repo.import_schema_files(branch_name="main", commit=commit) + config_file = await repo.get_repository_config(branch_name="main", commit=commit) + assert config_file + await repo.import_schema_files(branch_name="main", commit=commit, config_file=config_file) assert await client.schema.get(kind="DemoEdgeFabric", refresh=True) @@ -239,14 +241,16 @@ async def test_import_all_yaml_files( self, db: InfrahubDatabase, client: InfrahubClient, repo: InfrahubRepository, query_99 ): commit = repo.get_commit_value(branch_name="main") - await repo.import_all_yaml_files(branch_name="main", commit=commit, exclude=["artifact_definitions"]) + config_file = await repo.get_repository_config(branch_name="main", commit=commit) + assert config_file + await repo.import_rfiles(branch_name="main", commit=commit, config_file=config_file) rfiles = await client.all(kind="CoreRFile") assert len(rfiles) == 2 # Validate if the function is idempotent, another import just after the first one shouldn't change anything nbr_relationships_before = await count_relationships(db=db) - await repo.import_all_yaml_files(branch_name="main", commit=commit, exclude=["artifact_definitions"]) + await repo.import_rfiles(branch_name="main", commit=commit, config_file=config_file) assert await count_relationships(db=db) == nbr_relationships_before # 1. 
Modify an object to validate if its being properly updated @@ -267,7 +271,7 @@ async def test_import_all_yaml_files( ) await obj.save(db=db) - await repo.import_all_yaml_files(branch_name="main", commit=commit, exclude=["artifact_definitions"]) + await repo.import_rfiles(branch_name="main", commit=commit, config_file=config_file) modified_rfile = await client.get(kind="CoreRFile", id=rfiles[0].id) assert modified_rfile.template_path.value == rfile_template_path_value_before_change diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index e16628744e..b6e1688871 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -30,11 +30,37 @@ # --------------------------------------------------------------------------------- # Repository Configuration file # --------------------------------------------------------------------------------- + + +class InfrahubRepositoryArtifactDefinitionConfig(pydantic.BaseModel): + name: str = pydantic.Field(..., description="The name of the artifact definition") + artifact_name: Optional[str] = pydantic.Field( + default=None, description="Name of the artifact created from this definition" + ) + parameters: Dict[str, Any] = pydantic.Field( + ..., description="The input parameters required to render this artifact" + ) + content_type: str = pydantic.Field(..., description="The content type of the rendered artifact") + targets: str = pydantic.Field(..., description="The group to target when creating artifacts") + transformation: str = pydantic.Field(..., description="The transformation to use.") + + class InfrahubRepositoryRFileConfig(pydantic.BaseModel): - name: str - query: str - repository: str - template_path: Path + name: str = pydantic.Field(..., description="The name of the RFile") + query: str = pydantic.Field(..., description="The name of the GraphQL Query") + repository: str = pydantic.Field(default="self", description="The repository") + template_path: Path = 
pydantic.Field(..., description="The path within the repository of the template file")
+    description: Optional[str] = pydantic.Field(default=None, description="Description for this rfile")
+
+    @property
+    def template_path_value(self) -> str:
+        return self.template_path.as_posix()
+
+    @property
+    def payload(self) -> Dict[str, str]:
+        data = self.dict(exclude_none=True)
+        data["template_path"] = self.template_path_value
+        return data
 
 
 class InfrahubCheckDefinitionConfig(pydantic.BaseModel):
@@ -45,6 +71,7 @@ class InfrahubRepositoryConfig(pydantic.BaseModel):
     check_definitions: List[InfrahubCheckDefinitionConfig] = pydantic.Field(default_factory=list)
     schemas: List[Path] = pydantic.Field(default_factory=list)
     rfiles: List[InfrahubRepositoryRFileConfig] = pydantic.Field(default_factory=list)
+    artifact_definitions: List[InfrahubRepositoryArtifactDefinitionConfig] = pydantic.Field(default_factory=list)
 
 
 # ---------------------------------------------------------------------------------

From c6f51e3dc289519032c0afeb7fff192f4e3a3e4a Mon Sep 17 00:00:00 2001
From: Bilal
Date: Mon, 20 Nov 2023 00:42:10 +0100
Subject: [PATCH 090/446] fix tab counters update on change

---
 .../object-item-details-paginated.tsx   |  8 +++++++-
 .../relationship-details-paginated.tsx  |  2 --
 .../relationships-details-paginated.tsx |  6 ++++--
 frontend/tests/e2e/relationships.cy.ts  | 16 +++++++++++++---
 4 files changed, 24 insertions(+), 8 deletions(-)

diff --git a/frontend/src/screens/object-item-details/object-item-details-paginated.tsx b/frontend/src/screens/object-item-details/object-item-details-paginated.tsx
index 0f5f6e7ac9..f739230c0b 100644
--- a/frontend/src/screens/object-item-details/object-item-details-paginated.tsx
+++ b/frontend/src/screens/object-item-details/object-item-details-paginated.tsx
@@ -316,7 +316,13 @@ export default function ObjectItemDetails(props: any) {
)} - {qspTab && } + {qspTab && ( + + )} ); @@ -116,7 +118,7 @@ export default function RelationshipsDetails(props: RelationshipsDetailsProps) { parentSchema={parentSchema} relationshipsData={relationships} relationshipSchema={relationshipSchema} - refetch={refetch} + refetch={() => Promise.all([refetch(), refetchObjectDetails()])} onDeleteRelationship={handleDeleteRelationship} /> diff --git a/frontend/tests/e2e/relationships.cy.ts b/frontend/tests/e2e/relationships.cy.ts index 9e085e2bdd..bbba40f9d0 100644 --- a/frontend/tests/e2e/relationships.cy.ts +++ b/frontend/tests/e2e/relationships.cy.ts @@ -10,11 +10,11 @@ describe("Relationship Page", () => { cy.contains("button", "Edit").should("be.disabled"); cy.contains("button", "Manage groups").should("be.disabled"); - cy.contains("Artifacts").click(); + cy.contains("Artifacts2").click(); cy.url().should("include", "tab=artifacts"); cy.get("[data-cy='relationship-row']").should("have.length", 2); - cy.contains("Interfaces").click(); + cy.contains("Interfaces14").click(); cy.url().should("include", "tab=interfaces"); cy.get("[data-cy='relationship-row']").should("have.length", 10); cy.contains("Showing 1 to 10 of 14 results").should("exist"); @@ -39,7 +39,10 @@ describe("Relationship Page", () => { cy.get("[data-cy='select2step-1']").type("Int"); cy.contains("InterfaceL2").click(); - // fill 2md select with click only + // fill 2nd select with click only + // /!\ Is SUPER FLAKY ! + // 1. No easy visual way to differentiate each relationship "Ethernet11" + // 2. 
Order of "Ethernet11" is not guaranteed to be the same each time cy.get("[data-cy='select2step-2']").should("be.visible"); cy.get("[data-cy='select2step-2'] button").click(); cy.contains("Ethernet11").click(); @@ -49,6 +52,8 @@ describe("Relationship Page", () => { cy.contains("Association with InfraInterface added").should("be.visible"); cy.get("[data-cy='relationship-row']").contains("Ethernet11").should("be.visible"); + cy.contains("Showing 1 to 10 of 15 results").should("exist"); + cy.contains("Interfaces15").should("exist"); }); it("should delete the newly created relationship", () => { @@ -58,6 +63,9 @@ describe("Relationship Page", () => { cy.contains("Interfaces").click(); // get delete button from row containing Ethernet11 + // /!\ Is SUPER FLAKY ! + // 1. No easy visual way to differentiate each relationship "Ethernet11" + // 2. Order of "Ethernet11" is not guaranteed to be the same each time cy.get("[data-cy='relationship-row']") .contains(/^Ethernet11$/) .parent() @@ -75,6 +83,8 @@ describe("Relationship Page", () => { // after delete cy.contains("Item removed from the group").should("be.visible"); + cy.contains("Showing 1 to 10 of 14 results").should("exist"); + cy.contains("Interfaces14").should("exist"); cy.get("[data-cy='modal-delete']").should("not.exist"); cy.get("[data-cy='relationship-row']").should("not.contain", /^Ethernet11$/); }); From 677a486fb0a9ef8bdcb291a378416a878f2d6773 Mon Sep 17 00:00:00 2001 From: Bilal Date: Mon, 20 Nov 2023 03:06:24 +0100 Subject: [PATCH 091/446] fix ci --- frontend/src/components-form/input.tsx | 3 ++- .../tutorial/tutorial-1-branch-and-version-control.cy.ts | 9 ++------- 2 files changed, 4 insertions(+), 8 deletions(-) diff --git a/frontend/src/components-form/input.tsx b/frontend/src/components-form/input.tsx index 2e3b6a3c62..a49a169d99 100644 --- a/frontend/src/components-form/input.tsx +++ b/frontend/src/components-form/input.tsx @@ -21,12 +21,13 @@ export const OpsInput = (props: OpsInputProps) => { return 
( <>
-
{ cy.get("[data-cy='create']").click(); // Add organization name - cy.get(".grid > :nth-child(1) > .relative > .block").type(ORGANIZATION_NAME, { - delay: 0, - force: true, - }); - cy.get(".grid > :nth-child(3) > .relative > .block").type(ORGANIZATION_DESCRIPTION, { - delay: 0, - }); + cy.get("#Name").type(ORGANIZATION_NAME); + cy.get("#Description").type(ORGANIZATION_DESCRIPTION); if (this.screenshots) { cy.screenshot("tutorial_1_organization_create", screenshotConfig); From 8ec7396f8dbac5157cd13ac77c93dc408c00f9f8 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Mon, 20 Nov 2023 08:21:01 +0100 Subject: [PATCH 092/446] revert cypress config and fix package lock --- frontend/cypress.config.ts | 2 +- frontend/package-lock.json | 17 ----------------- 2 files changed, 1 insertion(+), 18 deletions(-) diff --git a/frontend/cypress.config.ts b/frontend/cypress.config.ts index 18d474390f..7c16c3d51b 100644 --- a/frontend/cypress.config.ts +++ b/frontend/cypress.config.ts @@ -30,7 +30,7 @@ export default defineConfig({ // setupNodeEvents(on) { // on("file:preprocessor", vitePreprocessor()); // }, - baseUrl: "http://localhost:8000/", + baseUrl: "http://localhost:8080/", specPattern: "tests/e2e/**/*.cy.{js,jsx,ts,tsx}", reporter: "spec", video: false, diff --git a/frontend/package-lock.json b/frontend/package-lock.json index 877c4293ae..c281ceed54 100644 --- a/frontend/package-lock.json +++ b/frontend/package-lock.json @@ -13,7 +13,6 @@ "@heroicons/react": "^2.0.15", "@hookform/error-message": "^2.0.1", "@iconify-icon/react": "^1.0.8", - "@iconify-icons/mdi": "^1.2.48", "@iconify-json/mdi": "^1.1.55", "@sentry/react": "^7.45.0", "@sentry/tracing": "^7.45.0", @@ -2411,14 +2410,6 @@ "react": ">=16" } }, - "node_modules/@iconify-icons/mdi": { - "version": "1.2.48", - "resolved": "https://registry.npmjs.org/@iconify-icons/mdi/-/mdi-1.2.48.tgz", - "integrity": "sha512-51bfNoRLhYDfxSu0Nyi/uRVq6q/tP4TyEc0vvuNwImrXpxrRJUAWJF2A36CfBkXm3hO9IBlph/CD/XNDJKgG6w==", - "dependencies": { - 
"@iconify/types": "*" - } - }, "node_modules/@iconify-json/mdi": { "version": "1.1.55", "resolved": "https://registry.npmjs.org/@iconify-json/mdi/-/mdi-1.1.55.tgz", @@ -14323,14 +14314,6 @@ "iconify-icon": "^1.0.8" } }, - "@iconify-icons/mdi": { - "version": "1.2.48", - "resolved": "https://registry.npmjs.org/@iconify-icons/mdi/-/mdi-1.2.48.tgz", - "integrity": "sha512-51bfNoRLhYDfxSu0Nyi/uRVq6q/tP4TyEc0vvuNwImrXpxrRJUAWJF2A36CfBkXm3hO9IBlph/CD/XNDJKgG6w==", - "requires": { - "@iconify/types": "*" - } - }, "@iconify-json/mdi": { "version": "1.1.55", "resolved": "https://registry.npmjs.org/@iconify-json/mdi/-/mdi-1.1.55.tgz", From 7bfd72c803ce30811e9426d22809eb90d4259f38 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Mon, 20 Nov 2023 08:31:18 +0100 Subject: [PATCH 093/446] remove videos for integration tests --- frontend/cypress.config.ts | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/frontend/cypress.config.ts b/frontend/cypress.config.ts index 7c16c3d51b..f0fb98d4d1 100644 --- a/frontend/cypress.config.ts +++ b/frontend/cypress.config.ts @@ -22,7 +22,7 @@ export default defineConfig({ }, specPattern: "tests/integrations/**/*.cy.{js,jsx,ts,tsx}", reporter: "spec", - video: true, + video: false, viewportHeight: 720, viewportWidth: 1280, }, From 5b12bb8bb11055ce7fd5f0e6c7799dd3f54e9394 Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Mon, 20 Nov 2023 08:38:12 +0100 Subject: [PATCH 094/446] Fix AttributeGetQuery --- backend/infrahub/core/query/attribute.py | 20 ++++++++++--------- .../tests/unit/core/test_attribute_query.py | 15 ++++++++++++++ 2 files changed, 26 insertions(+), 9 deletions(-) create mode 100644 backend/tests/unit/core/test_attribute_query.py diff --git a/backend/infrahub/core/query/attribute.py b/backend/infrahub/core/query/attribute.py index 14fbd40a54..45b362f352 100644 --- a/backend/infrahub/core/query/attribute.py +++ b/backend/infrahub/core/query/attribute.py @@ -202,16 +202,18 @@ async def query_init(self, db: InfrahubDatabase, 
*args, **kwargs): at = self.at or self.attr.at self.params["at"] = at.to_string() - rels_filter, rel_params = self.branch.get_query_filter_relationships(rel_labels=["r1", "r2"], at=at.to_string()) - self.params.update(rel_params) + rels_filter, rels_params = self.branch.get_query_filter_path(at=at.to_string()) + self.params.update(rels_params) - query = """ - MATCH (n { uuid: $node_uuid }) - MATCH (a { uuid: $attr_uuid }) - MATCH (n)-[r1]-(a)-[r2:HAS_VALUE|IS_VISIBLE|IS_PROTECTED|HAS_SOURCE|HAS_OWNER]-(ap) - WHERE %s - """ % ("\n AND ".join(rels_filter),) + query = ( + """ + MATCH (a:Attribute { uuid: $attr_uuid }) + MATCH p = ((a)-[r2:HAS_VALUE|IS_VISIBLE|IS_PROTECTED|HAS_SOURCE|HAS_OWNER]->(ap)) + WHERE all(r IN relationships(p) WHERE ( %s)) + """ + % rels_filter + ) self.add_to_query(query) - self.return_labels = ["n", "a", "ap", "r1", "r2"] + self.return_labels = ["a", "ap", "r2"] diff --git a/backend/tests/unit/core/test_attribute_query.py b/backend/tests/unit/core/test_attribute_query.py new file mode 100644 index 0000000000..c5d9043d48 --- /dev/null +++ b/backend/tests/unit/core/test_attribute_query.py @@ -0,0 +1,15 @@ +from infrahub.core.branch import Branch +from infrahub.core.node import Node +from infrahub.core.query.attribute import AttributeGetQuery +from infrahub.database import InfrahubDatabase + + +async def test_AttributeGetQuery(db: InfrahubDatabase, default_branch: Branch, car_person_schema): + obj = await Node.init(db=db, schema="TestPerson", branch=default_branch) + await obj.new(db=db, name="John", height=180) + await obj.save(db=db) + + query = await AttributeGetQuery.init(db=db, attr=obj.name) + await query.execute(db=db) + + assert query.num_of_results == 3 From bc2139b575948947853976f25827f35c3f5804ff Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Mon, 20 Nov 2023 08:38:27 +0100 Subject: [PATCH 095/446] Add unit tests for NodeDeleteQuery --- backend/tests/unit/core/test_node_query.py | 16 ++++++++++++++++ 1 file changed, 16 
insertions(+) diff --git a/backend/tests/unit/core/test_node_query.py b/backend/tests/unit/core/test_node_query.py index 91ec5e9d91..77c8d87bf9 100644 --- a/backend/tests/unit/core/test_node_query.py +++ b/backend/tests/unit/core/test_node_query.py @@ -6,6 +6,7 @@ from infrahub.core.node import Node from infrahub.core.query.node import ( NodeCreateAllQuery, + NodeDeleteQuery, NodeGetListQuery, NodeListGetAttributeQuery, NodeListGetInfoQuery, @@ -348,3 +349,18 @@ async def test_query_NodeListGetRelationshipsQuery(db: InfrahubDatabase, default assert person_jack_tags_main.id in result assert "builtintag__testperson" in result[person_jack_tags_main.id] assert len(result[person_jack_tags_main.id]["builtintag__testperson"]) == 2 + + +async def test_query_NodeDeleteQuery( + db: InfrahubDatabase, + default_branch: Branch, + person_jack_tags_main: Node, + tag_blue_main: Node, +): + tags_before = await NodeManager.query(db=db, schema="BuiltinTag", branch=default_branch) + + query = await NodeDeleteQuery.init(db=db, node=tag_blue_main, branch=default_branch) + await query.execute(db=db) + + tags_after = await NodeManager.query(db=db, schema="BuiltinTag", branch=default_branch) + assert len(tags_after) == len(tags_before) - 1 From 4ff15859b946b8a587449596cf64365c2b085e5a Mon Sep 17 00:00:00 2001 From: Damien Garros Date: Mon, 20 Nov 2023 08:39:18 +0100 Subject: [PATCH 096/446] get_many: Limit NodeListGetAttributeQuery to list of nodes returned by NodeListGetInfoQuery --- backend/infrahub/core/manager.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/backend/infrahub/core/manager.py b/backend/infrahub/core/manager.py index 2a542d990a..6e7d5dcbeb 100644 --- a/backend/infrahub/core/manager.py +++ b/backend/infrahub/core/manager.py @@ -331,7 +331,7 @@ async def get_many( # pylint: disable=too-many-branches # Query list of all Attributes query = await NodeListGetAttributeQuery.init( db=db, - ids=ids, + ids=list(nodes_info_by_id.keys()), fields=fields, 
branch=branch, include_source=include_source, From 4b4e5f8b19e3da475a8145d8c17b6a4b529789e4 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Mon, 20 Nov 2023 08:40:59 +0100 Subject: [PATCH 097/446] fix cypress config --- frontend/cypress.config.ts | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/frontend/cypress.config.ts b/frontend/cypress.config.ts index f0fb98d4d1..984b8b2420 100644 --- a/frontend/cypress.config.ts +++ b/frontend/cypress.config.ts @@ -30,7 +30,7 @@ export default defineConfig({ // setupNodeEvents(on) { // on("file:preprocessor", vitePreprocessor()); // }, - baseUrl: "http://localhost:8080/", + baseUrl: "http://localhost:8000/", specPattern: "tests/e2e/**/*.cy.{js,jsx,ts,tsx}", reporter: "spec", video: false, From aea2eadc998d217c4885a59993de6d7b13ba6b4e Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 09:08:29 +0100 Subject: [PATCH 098/446] Fix for Pydantic 2 --- python_sdk/infrahub_sdk/schema.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index b6e1688871..e6329ffdb5 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -54,7 +54,7 @@ class InfrahubRepositoryRFileConfig(pydantic.BaseModel): @property def template_path_value(self) -> str: - return self.template_path.as_posix() + return str(self.template_path) @property def payload(self) -> Dict[str, str]: From 40345654aa087d408cbc1292ff87921f74157825 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Mon, 20 Nov 2023 09:18:43 +0100 Subject: [PATCH 099/446] Update labels.yml --- .github/labels.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/labels.yml b/.github/labels.yml index c00d9c1c6f..293e41ead9 100644 --- a/.github/labels.yml +++ b/.github/labels.yml @@ -142,7 +142,7 @@ description: "The redaction of the issue is still a work in progress" color: "dcb518" -- name: "state/referenced" +- name: "state/ref"
description: "This issue is referenced in our internal tooling" color: "c9510c" From b7678b3298db2ef0a3fef1a0a280e2cf884e6ea4 Mon Sep 17 00:00:00 2001 From: Bilal Date: Mon, 20 Nov 2023 10:13:58 +0100 Subject: [PATCH 100/446] added test for update relationship --- .../relationships-details-paginated.tsx | 9 ++++-- frontend/tests/e2e/relationships.cy.ts | 29 ++++++++++++++++++- 2 files changed, 34 insertions(+), 4 deletions(-) diff --git a/frontend/src/screens/object-item-details/relationships-details-paginated.tsx b/frontend/src/screens/object-item-details/relationships-details-paginated.tsx index aec07f5850..114c8d9b17 100644 --- a/frontend/src/screens/object-item-details/relationships-details-paginated.tsx +++ b/frontend/src/screens/object-item-details/relationships-details-paginated.tsx @@ -66,6 +66,10 @@ export default function RelationshipsDetails(props: RelationshipsDetailsProps) { const { loading, error, data, refetch } = useQuery(query, { skip: !relationshipTab }); + const updatePageData = () => { + return Promise.all([refetch(), refetchObjectDetails()]); + }; + if (loading) { return ; } @@ -104,8 +108,7 @@ export default function RelationshipsDetails(props: RelationshipsDetailsProps) { context: { branch: branch?.name, date }, }); - refetchObjectDetails(); - refetch(); + updatePageData(); toast(); }; @@ -118,7 +121,7 @@ export default function RelationshipsDetails(props: RelationshipsDetailsProps) { parentSchema={parentSchema} relationshipsData={relationships} relationshipSchema={relationshipSchema} - refetch={() => Promise.all([refetch(), refetchObjectDetails()])} + refetch={updatePageData} onDeleteRelationship={handleDeleteRelationship} /> diff --git a/frontend/tests/e2e/relationships.cy.ts b/frontend/tests/e2e/relationships.cy.ts index bbba40f9d0..fc3ae52b86 100644 --- a/frontend/tests/e2e/relationships.cy.ts +++ b/frontend/tests/e2e/relationships.cy.ts @@ -56,7 +56,7 @@ describe("Relationship Page", () => { cy.contains("Interfaces15").should("exist"); 
}); - it("should delete the newly created relationship", () => { + it("should update the new relationship", () => { cy.login(ADMIN_CREDENTIALS.username, ADMIN_CREDENTIALS.password); cy.visit("/objects/InfraDevice"); cy.contains("atl1-edge1").click(); @@ -69,6 +69,33 @@ describe("Relationship Page", () => { cy.get("[data-cy='relationship-row']") .contains(/^Ethernet11$/) .parent() + .within(() => { + cy.get("[data-cy='metadata-edit-button']").click(); + }); + + cy.get("[data-cy='form']").within(() => { + cy.get("#Description").type("Test description"); + cy.contains("Save").click(); + }); + + cy.contains("InterfaceL2 updated").should("be.visible"); + cy.get("[data-cy='relationship-row']") + .contains(/^Ethernet11$/) + .parent() + .within(() => { + cy.contains("Test description").should("be.visible"); + }); + }); + + it("should delete the newly created relationship", () => { + cy.login(ADMIN_CREDENTIALS.username, ADMIN_CREDENTIALS.password); + cy.visit("/objects/InfraDevice"); + cy.contains("atl1-edge1").click(); + cy.contains("Interfaces").click(); + + cy.get("[data-cy='relationship-row']") + .contains(/^Test description$/) + .parent() .within(() => { cy.get("[data-cy='relationship-delete-button']").click(); }); From 0b6d539c1ea11001b3031453447082ae00296af4 Mon Sep 17 00:00:00 2001 From: pa-lem Date: Mon, 20 Nov 2023 10:34:08 +0100 Subject: [PATCH 101/446] revert comment --- frontend/src/screens/layout/header.tsx | 15 ++++++++------- 1 file changed, 8 insertions(+), 7 deletions(-) diff --git a/frontend/src/screens/layout/header.tsx b/frontend/src/screens/layout/header.tsx index d857cde971..11a495d9d0 100644 --- a/frontend/src/screens/layout/header.tsx +++ b/frontend/src/screens/layout/header.tsx @@ -20,6 +20,7 @@ import { dateVar } from "../../graphql/variables/dateVar"; import useQuery from "../../hooks/useQuery"; import { schemaState } from "../../state/atoms/schema.atom"; import { classNames, debounce, parseJwt } from "../../utils/common"; +import LoadingScreen 
from "../loading-screen/loading-screen"; import { userNavigation } from "./navigation-list"; interface Props { @@ -90,13 +91,13 @@ export default function Header(props: Props) { setQspDate(undefined); }; - // if (loading || !schema) { - // return ( - //
- // - //
- // ); - // } + if (loading || !schema) { + return ( +
+ +
+ ); + } const profile = data?.AccountProfile; From 6dd5d2924d327c2ec5f2f4bf173482bdb38d3d84 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 11:06:27 +0100 Subject: [PATCH 102/446] Fix issue with multiple transforms Also moves transforms to .infrahub.yml Fixes #1378 --- backend/infrahub/git/repository.py | 198 ++++++++++-------- .../repos/infrahub-demo-edge/.infrahub.yml | 3 + python_sdk/infrahub_sdk/schema.py | 5 + 3 files changed, 116 insertions(+), 90 deletions(-) diff --git a/backend/infrahub/git/repository.py b/backend/infrahub/git/repository.py index e4b8ba1802..343ecd5e76 100644 --- a/backend/infrahub/git/repository.py +++ b/backend/infrahub/git/repository.py @@ -1004,8 +1004,6 @@ async def import_objects_from_files(self, branch_name: str, commit: Optional[str if config_file: await self.import_all_python_files(branch_name=branch_name, commit=commit, config_file=config_file) - - if config_file: await self.import_rfiles(branch_name=branch_name, commit=commit, config_file=config_file) await self.import_artifact_definitions(branch_name=branch_name, commit=commit, config_file=config_file) @@ -1428,6 +1426,80 @@ async def import_python_check_definitions( ) await check_definition_in_graph[check_name].delete() + async def import_python_transforms( + self, branch_name: str, commit: str, config_file: InfrahubRepositoryConfig + ) -> None: + commit_wt = self.get_worktree(identifier=commit) + branch_wt = self.get_worktree(identifier=commit or branch_name) + + # Ensure the path for this repository is present in sys.path + if self.directory_root not in sys.path: + sys.path.append(self.directory_root) + + transforms = [] + for transform in config_file.python_transforms: + LOGGER.debug(self.name, import_type="python_transform", file=transform.file_path) + + file_info = extract_repo_file_information( + full_filename=os.path.join(branch_wt.directory, transform.file_path.as_posix()), + repo_directory=self.directory_root, + 
worktree_directory=commit_wt.directory, + ) + try: + module = importlib.import_module(file_info.module_name) + except ModuleNotFoundError as exc: + LOGGER.warning( + self.name, import_type="python_transform", file=transform.file_path.as_posix(), error=str(exc) + ) + continue + + transforms.extend( + await self.get_python_transforms( + branch_name=branch_name, + module=module, + file_path=file_info.relative_path_file, + ) + ) + + local_transform_definitions = {transform.name: transform for transform in transforms} + transform_definition_in_graph = { + transform.name.value: transform + for transform in await self.client.filters( + kind="CoreTransformPython", branch=branch_name, repository__ids=[str(self.id)] + ) + } + + present_in_both, only_graph, only_local = compare_lists( + list1=list(transform_definition_in_graph.keys()), list2=list(local_transform_definitions.keys()) + ) + + for transform_name in only_local: + LOGGER.info( + f"{self.name} | New TransformPython '{transform_name}' found on branch {branch_name} ({commit[:8]}), creating" + ) + await self.create_python_transform( + branch_name=branch_name, transform=local_transform_definitions[transform_name] + ) + + for transform_name in present_in_both: + if not await self.compare_python_transform( + local_transform=local_transform_definitions[transform_name], + existing_transform=transform_definition_in_graph[transform_name], + ): + LOGGER.info( + f"{self.name} | New version of TransformPython '{transform_name}' found on branch {branch_name} ({commit[:8]}), updating" + ) + await self.update_python_transform( + local_transform=local_transform_definitions[transform_name], + existing_transform=transform_definition_in_graph[transform_name], + ) + + for transform_name in only_graph: + LOGGER.info( + f"{self.name} | TransformPython '{transform_name}' not found locally in branch {branch_name}, deleting" + ) + await transform_definition_in_graph[transform_name].delete() + async def get_check_definitions( self, 
branch_name: str, module: types.ModuleType, file_path: str ) -> List[CheckDefinitionInformation]: @@ -1460,6 +1532,39 @@ async def get_check_definitions( continue return checks + async def get_python_transforms( + self, branch_name: str, module: types.ModuleType, file_path: str + ) -> List[TransformPythonInformation]: + if INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT not in dir(module): + return [] + + transforms = [] + for transform_class in getattr(module, INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT): + graphql_query = await self.client.get( + kind="CoreGraphQLQuery", branch=branch_name, id=str(transform_class.query), populate_store=True + ) + try: + transforms.append( + TransformPythonInformation( + name=transform_class.__name__, + repository=str(self.id), + class_name=transform_class.__name__, + transform_class=transform_class, + file_path=file_path, + query=str(graphql_query.id), + timeout=transform_class.timeout, + rebase=transform_class.rebase, + url=transform_class.url, + ) + ) + + except Exception as exc: # pylint: disable=broad-exception-caught + LOGGER.error( + f"{self.name} | An error occurred while processing the PythonTransform {transform_class.__name__} from {file_path} : {exc} " + ) + continue + return transforms + async def create_python_check_definition(self, branch_name: str, check: CheckDefinitionInformation) -> InfrahubNode: data = { "name": check.name, @@ -1525,70 +1630,6 @@ async def compare_python_check_definition( return False return True - async def import_python_transforms_from_module(self, branch_name: str, commit: str, module, file_path: str): - # TODO add function to validate if a check is valid - - if INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT not in dir(module): - return False - - transforms_in_graph = { - transform.name.value: transform - for transform in await self.client.filters( - kind="CoreTransformPython", branch=branch_name, repository__ids=[str(self.id)] - ) - } - - local_transforms = {} - - for transform_class in getattr(module,
INFRAHUB_TRANSFORM_VARIABLE_TO_IMPORT): - transform = transform_class() - - # Query the GraphQL query and (eventually) replace the name with the ID - graphql_query = await self.client.get( - kind="CoreGraphQLQuery", branch=branch_name, id=str(transform.query), populate_store=True - ) - - item = TransformPythonInformation( - name=transform.name, - repository=str(self.id), - query=str(graphql_query.id), - file_path=file_path, - url=transform.url, - transform_class=transform, - class_name=transform_class.__name__, - rebase=transform.rebase, - timeout=transform.timeout, - ) - local_transforms[item.name] = item - - present_in_both, only_graph, only_local = compare_lists( - list1=list(transforms_in_graph.keys()), list2=list(local_transforms.keys()) - ) - - for transform_name in only_local: - LOGGER.info( - f"{self.name} | New Python Transform '{transform_name}' found on branch {branch_name} ({commit[:8]}), creating" - ) - await self.create_python_transform(branch_name=branch_name, transform=local_transforms[transform_name]) - - for transform_name in present_in_both: - if not await self.compare_python_transform( - existing_transform=transforms_in_graph[transform_name], local_transform=local_transforms[transform_name] - ): - LOGGER.info( - f"{self.name} | New version of the Python Transform '{transform_name}' found on branch {branch_name} ({commit[:8]}), updating" - ) - await self.update_python_transform( - existing_transform=transforms_in_graph[transform_name], - local_transform=local_transforms[transform_name], - ) - - for transform_name in only_graph: - LOGGER.info( - f"{self.name} | Python Transform '{transform_name}' not found locally in branch {branch_name} ({commit[:8]}), deleting" - ) - await transforms_in_graph[transform_name].delete() - async def create_python_transform(self, branch_name: str, transform: TransformPythonInformation) -> InfrahubNode: schema = await self.client.schema.get(kind="CoreTransformPython", branch=branch_name) data = { @@ -1647,30 +1688,7 
@@ async def compare_python_transform( async def import_all_python_files(self, branch_name: str, commit: str, config_file: InfrahubRepositoryConfig): await self.import_python_check_definitions(branch_name=branch_name, commit=commit, config_file=config_file) - commit_wt = self.get_worktree(identifier=commit) - - python_files = await self.find_files(extension=["py"], commit=commit) - - # Ensure the path for this repository is present in sys.path - if self.directory_root not in sys.path: - sys.path.append(self.directory_root) - - for python_file in python_files: - LOGGER.debug(f"{self.name} | Checking {python_file}") - - file_info = extract_repo_file_information( - full_filename=python_file, repo_directory=self.directory_root, worktree_directory=commit_wt.directory - ) - - try: - module = importlib.import_module(file_info.module_name) - except ModuleNotFoundError: - LOGGER.warning(f"{self.name} | Unable to load python file {python_file}") - continue - - await self.import_python_transforms_from_module( - branch_name=branch_name, commit=commit, module=module, file_path=file_info.relative_path_file - ) + await self.import_python_transforms(branch_name=branch_name, commit=commit, config_file=config_file) async def find_files( self, diff --git a/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml b/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml index 3a77213c65..87ee4a7959 100644 --- a/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml +++ b/backend/tests/fixtures/repos/infrahub-demo-edge/.infrahub.yml @@ -33,3 +33,6 @@ artifact_definitions: check_definitions: - file_path: "checks/check_backbone_link_redundancy.py" + +python_transforms: + - file_path: "transforms/openconfig.py" diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index e6329ffdb5..5fd16fe1b2 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -67,11 +67,16 @@ class 
InfrahubCheckDefinitionConfig(pydantic.BaseModel): file_path: Path = pydantic.Field(..., description="The file within the repo with the check code.") +class InfrahubPythonTransformConfig(pydantic.BaseModel): + file_path: Path = pydantic.Field(..., description="The file within the repo with the transform code.") + + class InfrahubRepositoryConfig(pydantic.BaseModel): check_definitions: List[InfrahubCheckDefinitionConfig] = pydantic.Field(default_factory=list) schemas: List[Path] = pydantic.Field(default_factory=list) rfiles: List[InfrahubRepositoryRFileConfig] = pydantic.Field(default_factory=list) artifact_definitions: List[InfrahubRepositoryArtifactDefinitionConfig] = pydantic.Field(default_factory=list) + python_transforms: List[InfrahubPythonTransformConfig] = pydantic.Field(default_factory=list) # --------------------------------------------------------------------------------- From 12a65434f99901499b068c8010cbc6df64ff5c9a Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 11:51:12 +0100 Subject: [PATCH 103/446] Reactivate coverage report for CTL --- .github/workflows/ci.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index a0282ba7fd..72fb398437 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -151,7 +151,7 @@ jobs: - name: "Mypy Tests" run: "poetry run mypy --show-error-codes infrahub_sdk/ infrahub_ctl/" - name: "Unit Tests" - run: "poetry run pytest -v --cov=infrahub_sdk tests/unit" + run: "poetry run pytest -v --cov=infrahub_sdk --cov=infrahub_ctl tests/unit" env: BUILDKITE_ANALYTICS_TOKEN: ${{ secrets.BUILDKITE_SDK_UNIT }} - name: "Coveralls : Unit Tests" From bc7cd0f659f9b4e9c2435b7f4473536e7bfd3240 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 12:15:33 +0100 Subject: [PATCH 104/446] Fix typehints for infrahub.storage --- backend/infrahub/storage/main.py | 8 +++++--- pyproject.toml | 4 ---- 2 files changed, 5 
insertions(+), 7 deletions(-) diff --git a/backend/infrahub/storage/main.py b/backend/infrahub/storage/main.py index 9c89ed06b9..994b6c1cb3 100644 --- a/backend/infrahub/storage/main.py +++ b/backend/infrahub/storage/main.py @@ -1,13 +1,15 @@ +from typing import Any + from typing_extensions import Self class InfrahubObjectStorage: @classmethod - async def init(cls, **kwargs) -> Self: + async def init(cls, **kwargs: Any) -> Self: return cls(**kwargs) - async def store(self, identifier: str, content: bytes): + async def store(self, identifier: str, content: bytes) -> None: raise NotImplementedError - async def retrieve(self, identifier: str): + async def retrieve(self, identifier: str) -> str: raise NotImplementedError diff --git a/pyproject.toml b/pyproject.toml index 93be611380..1ed3c3b8df 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -213,10 +213,6 @@ disallow_untyped_defs = false module = "infrahub.server" ignore_errors = true -[[tool.mypy.overrides]] -module = "infrahub.storage.main" -disallow_untyped_defs = false - [[tool.mypy.overrides]] module = "infrahub.tasks.registry" ignore_errors = true From 6864e0b491450f04ac662145ba2ddddc3ad8b999 Mon Sep 17 00:00:00 2001 From: Bilal Date: Mon, 20 Nov 2023 16:51:37 +0100 Subject: [PATCH 105/446] added test e2e for profile page --- frontend/src/components/avatar.tsx | 6 +++-- frontend/src/screens/layout/header.tsx | 5 +--- .../src/screens/user-profile/user-profile.tsx | 2 +- frontend/tests/e2e/profile.cy.ts | 26 +++++++++++++++++++ 4 files changed, 32 insertions(+), 7 deletions(-) create mode 100644 frontend/tests/e2e/profile.cy.ts diff --git a/frontend/src/components/avatar.tsx b/frontend/src/components/avatar.tsx index d4daf34050..b02efa9461 100644 --- a/frontend/src/components/avatar.tsx +++ b/frontend/src/components/avatar.tsx @@ -31,7 +31,7 @@ const getAvatarSize = (size?: AVATAR_SIZE) => { }; export const Avatar = (props: tAvatar) => { - const { name, image, size, className } = props; + const { name, image, 
size, className, ...otherProps } = props; if (image) { return ( @@ -39,6 +39,7 @@ export const Avatar = (props: tAvatar) => { className={`${getAvatarSize(size)} rounded-full object-cover`} src={image} alt="Avatar" + {...otherProps} /> ); } else { @@ -48,7 +49,8 @@ export const Avatar = (props: tAvatar) => { getAvatarSize(size), "rounded-full bg-custom-blue-200 text-custom-white flex justify-center items-center", className ?? "" - )}> + )} + {...otherProps}> {initials(name)}
); diff --git a/frontend/src/screens/layout/header.tsx b/frontend/src/screens/layout/header.tsx index 11a495d9d0..2aa9b6519d 100644 --- a/frontend/src/screens/layout/header.tsx +++ b/frontend/src/screens/layout/header.tsx @@ -174,10 +174,7 @@ export default function Header(props: Props) { - +
- {renderContent(qspTab)} +
{renderContent(qspTab)}
); } diff --git a/frontend/tests/e2e/profile.cy.ts b/frontend/tests/e2e/profile.cy.ts new file mode 100644 index 0000000000..2b0d39fbe7 --- /dev/null +++ b/frontend/tests/e2e/profile.cy.ts @@ -0,0 +1,26 @@ +/// + +import { ADMIN_CREDENTIALS } from "../utils"; + +describe("Profile page", () => { + it("should access and display all information about the user", () => { + cy.login(ADMIN_CREDENTIALS.username, ADMIN_CREDENTIALS.password); + cy.visit("/"); + + cy.get("[data-cy='user-avatar']").click(); + cy.contains("Your Profile").click(); + cy.url().should("include", "/profile"); + + cy.get("[data-cy='user-details']").within(() => { + cy.contains("Name").next().should("contain", "admin"); + cy.contains("Label").next().should("contain", "Admin"); + cy.contains("Description").next().should("contain", ""); + cy.contains("Type").next().should("contain", "User"); + cy.contains("Role").next().should("contain", "admin"); + }); + + cy.contains("Preference").click(); + cy.url().should("include", "tab=preferences"); + cy.contains("Update your password").should("be.visible"); + }); +}); From 3379332e149fde4c7da85de232f75e9357ba07bd Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 13:05:20 +0100 Subject: [PATCH 106/446] Update schema for Python checks in .infrahub.yml Add parameters and targets. If targets is empty this will be a global check, otherwise it will target a specific group. Also introduces InfrahubCheckInitializer; it provides information to the check that user-created code can act upon. For now it contains information about which proposed change (if any) triggered the check to run. The reason for this is to support the use case of checking something within an external system, such as ServiceNow, to see if a given proposed change has been approved. We don't yet inject the proposed change into the check; that will come in a follow-up PR.
Related to #1049, will move out User checks from the RepositoryValidators as a next step and also add the ability to target groups of devices for these checks. --- python_sdk/infrahub_sdk/checks.py | 22 +++++++++++++++++++++- python_sdk/infrahub_sdk/schema.py | 6 ++++++ 2 files changed, 27 insertions(+), 1 deletion(-) diff --git a/python_sdk/infrahub_sdk/checks.py b/python_sdk/infrahub_sdk/checks.py index 023a306989..f6a7b0a5c1 100644 --- a/python_sdk/infrahub_sdk/checks.py +++ b/python_sdk/infrahub_sdk/checks.py @@ -10,18 +10,38 @@ from infrahub_sdk import InfrahubClient +try: + from pydantic import v1 as pydantic # type: ignore[attr-defined] +except ImportError: + import pydantic # type: ignore[no-redef] + INFRAHUB_CHECK_VARIABLE_TO_IMPORT = "INFRAHUB_CHECKS" +class InfrahubCheckInitializer(pydantic.BaseModel): + """Information about the originator of the check.""" + + proposed_change_id: str = pydantic.Field( + default="", description="If available the ID of the proposed change that requested the check" + ) + + class InfrahubCheck: name: Optional[str] = None query: str = "" timeout: int = 10 rebase: bool = True - def __init__(self, branch: str = "", root_directory: str = "", output: Optional[str] = None): + def __init__( + self, + branch: str = "", + root_directory: str = "", + output: Optional[str] = None, + initializer: Optional[InfrahubCheckInitializer] = None, + ): self.data: Dict = {} self.git: Optional[Repo] = None + self.initializer = initializer or InfrahubCheckInitializer() self.logs: List[Dict[str, Any]] = [] self.passed = False diff --git a/python_sdk/infrahub_sdk/schema.py b/python_sdk/infrahub_sdk/schema.py index 5fd16fe1b2..de5a040008 100644 --- a/python_sdk/infrahub_sdk/schema.py +++ b/python_sdk/infrahub_sdk/schema.py @@ -65,6 +65,12 @@ def payload(self) -> Dict[str, str]: class InfrahubCheckDefinitionConfig(pydantic.BaseModel): file_path: Path = pydantic.Field(..., description="The file within the repo with the check code.") + parameters: 
Dict[str, Any] = pydantic.Field( + default_factory=dict, description="The input parameters required to run this check" + ) + targets: Optional[str] = pydantic.Field( + default=None, description="The group to target when running this check, leave blank for global checks" + ) class InfrahubPythonTransformConfig(pydantic.BaseModel): From ed530ba7f1ea80a186a2f973120f725ea13106c6 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 14:32:21 +0100 Subject: [PATCH 107/446] Add missing documentation for infrahubctl and automate rendering Previously the commands had to be statically defined; with this change, new commands are picked up automatically. --- docs/infrahubctl/infrahubctl-render.md | 23 +++++++++++++++++++++ docs/infrahubctl/infrahubctl-run.md | 5 +++++ python_sdk/infrahub_ctl/cli.py | 4 ++-- python_sdk/tests/unit/ctl/test_cli.py | 12 ++++++++++++ tasks/ctl.py | 17 +++++++++-------- 5 files changed, 51 insertions(+), 10 deletions(-) create mode 100644 docs/infrahubctl/infrahubctl-render.md diff --git a/docs/infrahubctl/infrahubctl-render.md b/docs/infrahubctl/infrahubctl-render.md new file mode 100644 index 0000000000..6931c32291 --- /dev/null +++ b/docs/infrahubctl/infrahubctl-render.md @@ -0,0 +1,23 @@ +# `infrahubctl render` + +Render a local Jinja Template (RFile) for debugging purposes. + +**Usage**: + +```console +$ infrahubctl render [OPTIONS] RFILE [VARIABLES]... +``` + +**Arguments**: + +* `RFILE`: [required] +* `[VARIABLES]...`: Variables to pass along with the query. Format key=value key=value. + +**Options**: + +* `--branch TEXT`: Branch on which to render the RFile. +* `--debug / --no-debug`: [default: no-debug] +* `--config-file TEXT`: [env var: INFRAHUBCTL_CONFIG; default: infrahubctl.toml] +* `--install-completion`: Install completion for the current shell. +* `--show-completion`: Show completion for the current shell, to copy it or customize the installation. +* `--help`: Show this message and exit.
diff --git a/docs/infrahubctl/infrahubctl-run.md b/docs/infrahubctl/infrahubctl-run.md index d64fb45b4f..d61d846531 100644 --- a/docs/infrahubctl/infrahubctl-run.md +++ b/docs/infrahubctl/infrahubctl-run.md @@ -17,4 +17,9 @@ $ infrahubctl run [OPTIONS] SCRIPT * `--method TEXT`: [default: run] * `--debug / --no-debug`: [default: no-debug] * `--config-file TEXT`: [env var: INFRAHUBCTL_CONFIG; default: infrahubctl.toml] +* `--branch TEXT`: Branch on which to run the script. [default: main] +* `--concurrent INTEGER`: Maximum number of requests to execute at the same time. [env var: INFRAHUBCTL_CONCURRENT_EXECUTION; default: 4] +* `--timeout INTEGER`: Timeout in sec [env var: INFRAHUBCTL_TIMEOUT; default: 60] +* `--install-completion`: Install completion for the current shell. +* `--show-completion`: Show completion for the current shell, to copy it or customize the installation. * `--help`: Show this message and exit. diff --git a/python_sdk/infrahub_ctl/cli.py b/python_sdk/infrahub_ctl/cli.py index 242d69d518..e944b85730 100644 --- a/python_sdk/infrahub_ctl/cli.py +++ b/python_sdk/infrahub_ctl/cli.py @@ -104,7 +104,7 @@ def identify_faulty_jinja_code(traceback: Traceback, nbr_context_lines: int = 3) return response -@app.command() +@app.command(name="render") def render( # pylint: disable=too-many-branches,too-many-statements rfile: str, variables: Optional[List[str]] = typer.Argument( @@ -220,7 +220,7 @@ def render( # pylint: disable=too-many-branches,too-many-statements print(rendered_tpl) -@app.command() +@app.command(name="run") def run( script: Path, method: str = "run", diff --git a/python_sdk/tests/unit/ctl/test_cli.py b/python_sdk/tests/unit/ctl/test_cli.py index 84f77bbc72..f2e0a87e08 100644 --- a/python_sdk/tests/unit/ctl/test_cli.py +++ b/python_sdk/tests/unit/ctl/test_cli.py @@ -9,3 +9,15 @@ def test_main_app(): result = runner.invoke(app, ["--help"]) assert result.exit_code == 0 assert "[OPTIONS] COMMAND [ARGS]" in result.stdout + + +def
test_validate_all_commands_have_names(): + assert app.registered_commands + for command in app.registered_commands: + assert command.name + + +def test_validate_all_groups_have_names(): + assert app.registered_groups + for group in app.registered_groups: + assert group.name diff --git a/tasks/ctl.py b/tasks/ctl.py index 22fe130717..ccd9f903f0 100644 --- a/tasks/ctl.py +++ b/tasks/ctl.py @@ -20,16 +20,17 @@ @task def generate_doc(context: Context): """Generate the documentation for infrahubctl using typer-cli.""" + from infrahub_ctl.cli import app - CLI_COMMANDS = ( - ("infrahub_ctl.branch", "infrahubctl branch", "infrahubctl-branch"), - ("infrahub_ctl.schema", "infrahubctl schema", "infrahubctl-schema"), - ("infrahub_ctl.validate", "infrahubctl validate", "infrahubctl-validate"), - ("infrahub_ctl.check", "infrahubctl check", "infrahubctl-check"), - ) print(f" - [{NAMESPACE}] Generate CLI documentation") - for command in CLI_COMMANDS: - exec_cmd = f'typer {command[0]} utils docs --name "{command[1]}" --output docs/infrahubctl/{command[2]}.md' + for cmd in app.registered_commands: + exec_cmd = f'typer --func {cmd.name} infrahub_ctl.cli utils docs --name "infrahubctl {cmd.name}"' + exec_cmd += f" --output docs/infrahubctl/infrahubctl-{cmd.name}.md" + with context.cd(ESCAPED_REPO_PATH): + context.run(exec_cmd) + + for cmd in app.registered_groups: + exec_cmd = f'typer infrahub_ctl.{cmd.name} utils docs --name "infrahubctl {cmd.name}" --output docs/infrahubctl/infrahubctl-{cmd.name}.md' with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) From 333da33724f7b572263505ab0241e8d2417341c8 Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Tue, 21 Nov 2023 10:25:36 +0100 Subject: [PATCH 108/446] Move generation of all automated documentation to one command We can use this single command as part of the build process and also add to CI to verify that we don't have automated documentation that's not up to date. 
--- backend/infrahub/cli/__init__.py | 2 - backend/infrahub/cli/doc.py | 40 ---------------- docs/reference/schema/attribute.md | 10 ++++ docs/reference/schema/node.md | 30 ++++++++++++ tasks/__init__.py | 6 --- tasks/backend.py | 20 -------- tasks/ctl.py | 21 --------- tasks/docs.py | 75 ++++++++++++++++++++++++++++++ 8 files changed, 115 insertions(+), 89 deletions(-) delete mode 100644 backend/infrahub/cli/doc.py diff --git a/backend/infrahub/cli/__init__.py b/backend/infrahub/cli/__init__.py index 922760c3c8..0c0de3d33b 100644 --- a/backend/infrahub/cli/__init__.py +++ b/backend/infrahub/cli/__init__.py @@ -4,7 +4,6 @@ import infrahub.config as config from infrahub.cli.db import app as db_app -from infrahub.cli.doc import app as doc_app from infrahub.cli.events import app as events_app from infrahub.cli.generate_schema import app as generate_schema_app from infrahub.cli.git_agent import app as git_app @@ -21,7 +20,6 @@ app.add_typer(db_app, name="db") app.add_typer(events_app, name="events", help="Interact with the events system.") app.add_typer(generate_schema_app, name="generate-schema") -app.add_typer(doc_app, name="doc") async def _init_shell(config_file: str) -> None: diff --git a/backend/infrahub/cli/doc.py b/backend/infrahub/cli/doc.py deleted file mode 100644 index ae6f6e8e28..0000000000 --- a/backend/infrahub/cli/doc.py +++ /dev/null @@ -1,40 +0,0 @@ -import os -from pathlib import Path - -import jinja2 -import typer - -from infrahub.core.schema import internal_schema - -app = typer.Typer() - - -DOCUMENTATION_DIRECTORY = "../../../docs" - - -@app.command() -def generate_schema() -> None: - """Generate documentation for the schema""" - - schemas_to_generate = ["node", "attribute", "relationship", "generic"] - here = os.path.abspath(os.path.dirname(__file__)) - - for schema_name in schemas_to_generate: - template_file = os.path.join(here, f"{DOCUMENTATION_DIRECTORY}/reference/schema/{schema_name}.j2") - output_file = os.path.join(here, 
f"{DOCUMENTATION_DIRECTORY}/reference/schema/{schema_name}.md") - if not os.path.exists(template_file): - print(f"Unable to find the template file at {template_file}") - raise typer.Exit(1) - - template_text = Path(template_file).read_text(encoding="utf-8") - - environment = jinja2.Environment() - template = environment.from_string(template_text) - rendered_file = template.render(schema=internal_schema) - - with open(output_file, "w", encoding="utf-8") as f: - f.write(rendered_file) - - print(f"Schema generated for {schema_name}") - - print("Schema documentation generated") diff --git a/docs/reference/schema/attribute.md b/docs/reference/schema/attribute.md index 8fbef3f2ba..b407c8b764 100644 --- a/docs/reference/schema/attribute.md +++ b/docs/reference/schema/attribute.md @@ -99,6 +99,16 @@ Below is the list of all available options to define an Attribute in the schema | **Constraints** |
Length: min -, max - | +## read_only + +| -- | -- | { class="compact" } +| ---- | --------------- | +| **Name** | read_only | +| **Kind** | `Boolean` | +| **Description** | | +| **Constraints** | | + + ## unique | -- | -- | { class="compact" } diff --git a/docs/reference/schema/node.md b/docs/reference/schema/node.md index db38005159..c19e13ebd3 100644 --- a/docs/reference/schema/node.md +++ b/docs/reference/schema/node.md @@ -77,6 +77,36 @@ Below is the list of all available options to define a node in the schema | **Constraints** | | +## include_in_menu + +| -- | -- | { class="compact" } +| ---- | --------------- | +| **Name** | include_in_menu | +| **Kind** | `Boolean` | +| **Description** | Defines if objects of this kind should be included in the menu. | +| **Constraints** | | + + +## menu_placement + +| -- | -- | { class="compact" } +| ---- | --------------- | +| **Name** | menu_placement | +| **Kind** | `Text` | +| **Description** | Defines where in the menu this object should be placed. | +| **Constraints** | | + + +## icon + +| -- | -- | { class="compact" } +| ---- | --------------- | +| **Name** | icon | +| **Kind** | `Text` | +| **Description** | Defines the icon to be used for this object type. 
| +| **Constraints** | | + + ## order_by | -- | -- | { class="compact" } diff --git a/tasks/__init__.py b/tasks/__init__.py index e40ba3daad..0d26367d5a 100644 --- a/tasks/__init__.py +++ b/tasks/__init__.py @@ -40,12 +40,6 @@ def lint_all(context: Context): sync.lint(context) -@task(name="generate-doc") -def generate_doc(context: Context): - backend.generate_doc(context) - - ns.add_task(format_all) ns.add_task(lint_all) ns.add_task(yamllint) -ns.add_task(generate_doc) diff --git a/tasks/backend.py b/tasks/backend.py index 2ecd64259e..4d83b4d675 100644 --- a/tasks/backend.py +++ b/tasks/backend.py @@ -15,26 +15,6 @@ NAMESPACE = "BACKEND" -# ---------------------------------------------------------------------------- -# Documentation -# ---------------------------------------------------------------------------- -@task -def generate_doc(context: Context): - """Generate the documentation for infrahub cli using typer-cli.""" - - CLI_COMMANDS = ( - ("infrahub.cli.db", "infrahub db", "infrahub-db"), - ("infrahub.cli.server", "infrahub server", "infrahub-server"), - ("infrahub.cli.git_agent", "infrahub git-agent", "infrahub-git-agent"), - ) - - print(f" - [{NAMESPACE}] Generate CLI documentation") - with context.cd(ESCAPED_REPO_PATH): - for command in CLI_COMMANDS: - exec_cmd = f'typer {command[0]} utils docs --name "{command[1]}" --output docs/reference/infrahub-cli/{command[2]}.md' - context.run(exec_cmd) - - # ---------------------------------------------------------------------------- # Formatting tasks # ---------------------------------------------------------------------------- diff --git a/tasks/ctl.py b/tasks/ctl.py index ccd9f903f0..232a77417d 100644 --- a/tasks/ctl.py +++ b/tasks/ctl.py @@ -14,27 +14,6 @@ NAMESPACE = "CTL" -# ---------------------------------------------------------------------------- -# Documentation -# ---------------------------------------------------------------------------- -@task -def generate_doc(context: Context): - """Generate the 
documentation for infrahubctl using typer-cli.""" - from infrahub_ctl.cli import app - - print(f" - [{NAMESPACE}] Generate CLI documentation") - for cmd in app.registered_commands: - exec_cmd = f'typer --func {cmd.name} infrahub_ctl.cli utils docs --name "infrahubctl {cmd.name}"' - exec_cmd += f" --output docs/infrahubctl/infrahubctl-{cmd.name}.md" - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - for cmd in app.registered_groups: - exec_cmd = f'typer infrahub_ctl.{cmd.name} utils docs --name "infrahubctl {cmd.name}" --output docs/infrahubctl/infrahubctl-{cmd.name}.md' - with context.cd(ESCAPED_REPO_PATH): - context.run(exec_cmd) - - # ---------------------------------------------------------------------------- # Formatting tasks # ---------------------------------------------------------------------------- diff --git a/tasks/docs.py b/tasks/docs.py index 57b9264674..c1f0e573e8 100644 --- a/tasks/docs.py +++ b/tasks/docs.py @@ -1,9 +1,14 @@ +import os import sys +from pathlib import Path from invoke import Context, task from .utils import ESCAPED_REPO_PATH +CURRENT_DIRECTORY = os.path.abspath(os.path.dirname(__file__)) +DOCUMENTATION_DIRECTORY = os.path.join(CURRENT_DIRECTORY, "../docs") + @task def build(context: Context): @@ -24,6 +29,14 @@ def build(context: Context): sys.exit(-1) +@task +def generate(context: Context): + """Generate documentation output from code.""" + _generate_infrahub_cli_documentation(context=context) + _generate_infrahubctl_documentation(context=context) + _generate_infrahub_schema_documentation() + + @task def serve(context: Context): """Run documentation server in development mode.""" @@ -32,3 +45,65 @@ def serve(context: Context): with context.cd(ESCAPED_REPO_PATH): context.run(exec_cmd) + + +def _generate_infrahub_cli_documentation(context: Context): + """Generate the documentation for infrahub cli using typer-cli.""" + + CLI_COMMANDS = ( + ("infrahub.cli.db", "infrahub db", "infrahub-db"), + ("infrahub.cli.server", 
"infrahub server", "infrahub-server"), + ("infrahub.cli.git_agent", "infrahub git-agent", "infrahub-git-agent"), + ) + + print(" - Generate Infrahub CLI documentation") + with context.cd(ESCAPED_REPO_PATH): + for command in CLI_COMMANDS: + exec_cmd = f'typer {command[0]} utils docs --name "{command[1]}" --output docs/reference/infrahub-cli/{command[2]}.md' + context.run(exec_cmd) + + +def _generate_infrahubctl_documentation(context: Context): + """Generate the documentation for infrahubctl using typer-cli.""" + from infrahub_ctl.cli import app + + print(" - Generate infrahubctl CLI documentation") + for cmd in app.registered_commands: + exec_cmd = f'typer --func {cmd.name} infrahub_ctl.cli utils docs --name "infrahubctl {cmd.name}"' + exec_cmd += f" --output docs/infrahubctl/infrahubctl-{cmd.name}.md" + with context.cd(ESCAPED_REPO_PATH): + context.run(exec_cmd) + + for cmd in app.registered_groups: + exec_cmd = f'typer infrahub_ctl.{cmd.name} utils docs --name "infrahubctl {cmd.name}" --output docs/infrahubctl/infrahubctl-{cmd.name}.md' + with context.cd(ESCAPED_REPO_PATH): + context.run(exec_cmd) + + +def _generate_infrahub_schema_documentation() -> None: + """Generate documentation for the schema""" + import jinja2 + + from infrahub.core.schema import internal_schema + + schemas_to_generate = ["node", "attribute", "relationship", "generic"] + + for schema_name in schemas_to_generate: + template_file = f"{DOCUMENTATION_DIRECTORY}/reference/schema/{schema_name}.j2" + output_file = f"{DOCUMENTATION_DIRECTORY}/reference/schema/{schema_name}.md" + if not os.path.exists(template_file): + print(f"Unable to find the template file at {template_file}") + sys.exit(-1) + + template_text = Path(template_file).read_text(encoding="utf-8") + + environment = jinja2.Environment() + template = environment.from_string(template_text) + rendered_file = template.render(schema=internal_schema) + + with open(output_file, "w", encoding="utf-8") as f: + f.write(rendered_file) + + 
print(f"Schema generated for {schema_name}") + + print("Schema documentation generated") From e5f61c60af202ad212965bd28189868f9d39f6df Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Mon, 20 Nov 2023 17:19:10 +0100 Subject: [PATCH 109/446] Minor cleanup of how services are initialized --- backend/infrahub/message_bus/rpc.py | 16 ++++++---------- backend/infrahub/server.py | 12 +++++++----- 2 files changed, 13 insertions(+), 15 deletions(-) diff --git a/backend/infrahub/message_bus/rpc.py b/backend/infrahub/message_bus/rpc.py index baa6b85dee..270b04131c 100644 --- a/backend/infrahub/message_bus/rpc.py +++ b/backend/infrahub/message_bus/rpc.py @@ -8,11 +8,11 @@ from infrahub_sdk import UUIDT from infrahub import config -from infrahub.database import InfrahubDatabase, get_db from infrahub.log import clear_log_context, get_log_data, get_logger from infrahub.message_bus import messages from infrahub.message_bus.operations import execute_message -from infrahub.services import InfrahubServices +from infrahub.services import services +from infrahub.services.adapters.message_bus import InfrahubMessageBus from infrahub.services.adapters.message_bus.rabbitmq import RabbitMQMessageBus from infrahub.worker import WORKER_IDENTITY @@ -45,7 +45,7 @@ class InfrahubRpcClientBase: def __init__(self) -> None: self.futures: MutableMapping[str, asyncio.Future] = {} self.loop = asyncio.get_running_loop() - self.service: InfrahubServices = InfrahubServices() + self.rabbitmq: InfrahubMessageBus = InfrahubMessageBus() async def connect(self) -> InfrahubRpcClient: self.connection = await get_broker() @@ -103,12 +103,8 @@ async def connect(self) -> InfrahubRpcClient: await self.events_queue.bind(self.exchange, routing_key="refresh.registry.*") - db = InfrahubDatabase(driver=await get_db()) - self.service = InfrahubServices( - database=db, - message_bus=RabbitMQMessageBus( - channel=self.channel, exchange=self.exchange, delayed_exchange=self.delayed_exchange - ), + self.rabbitmq = 
RabbitMQMessageBus( + channel=self.channel, exchange=self.exchange, delayed_exchange=self.delayed_exchange ) return self @@ -123,7 +119,7 @@ async def on_response(self, message: AbstractIncomingMessage) -> None: clear_log_context() if message.routing_key in messages.MESSAGE_MAP: - await execute_message(routing_key=message.routing_key, message_body=message.body, service=self.service) + await execute_message(routing_key=message.routing_key, message_body=message.body, service=services.service) else: log.error("Invalid message received", message=f"{message!r}") diff --git a/backend/infrahub/server.py b/backend/infrahub/server.py index 6938aaf0b6..ce843d166a 100644 --- a/backend/infrahub/server.py +++ b/backend/infrahub/server.py @@ -16,8 +16,7 @@ from pydantic import ValidationError from starlette_exporter import PrometheusMiddleware, handle_metrics -import infrahub.config as config -from infrahub import __version__ +from infrahub import __version__, config from infrahub.api import router as api from infrahub.api.background import BackgroundRunner from infrahub.api.exception_handlers import generic_api_exception_handler @@ -30,7 +29,8 @@ from infrahub.message_bus import close_broker_connection, connect_to_broker from infrahub.message_bus.rpc import InfrahubRpcClient from infrahub.middleware import InfrahubCORSMiddleware -from infrahub.services import services +from infrahub.services import InfrahubServices, services +from infrahub.services.adapters.cache.redis import RedisCache from infrahub.trace import add_span_exception, configure_trace, get_traceid, get_tracer # pylint: disable=too-many-locals @@ -81,7 +81,7 @@ async def app_initialization(): ) # Initialize database Driver and load local registry - app.state.db = InfrahubDatabase(mode=InfrahubDatabaseMode.DRIVER, driver=await get_db()) + database = app.state.db = InfrahubDatabase(mode=InfrahubDatabaseMode.DRIVER, driver=await get_db()) initialize_lock() @@ -90,7 +90,9 @@ async def app_initialization(): # Initialize 
RPC Client app.state.rpc_client = await InfrahubRpcClient().connect() - services.prepare(service=app.state.rpc_client.service) + service = InfrahubServices(cache=RedisCache(), database=database) + service.message_bus = app.state.rpc_client.rabbitmq + services.prepare(service=service) async with app.state.db.start_session() as db: await initialization(db=db) From 0488c438884458b4049aeb152efb6a8cb297c466 Mon Sep 17 00:00:00 2001 From: Bilal Date: Mon, 20 Nov 2023 13:10:10 +0100 Subject: [PATCH 110/446] added isUnique prop to input component --- frontend/src/components-form/input.tsx | 1 + frontend/src/screens/edit-form-hook/dynamic-control-types.ts | 1 + frontend/src/utils/formStructureForCreateEdit.ts | 1 + 3 files changed, 3 insertions(+) diff --git a/frontend/src/components-form/input.tsx b/frontend/src/components-form/input.tsx index a49a169d99..a04a4ca999 100644 --- a/frontend/src/components-form/input.tsx +++ b/frontend/src/components-form/input.tsx @@ -12,6 +12,7 @@ type OpsInputProps = { type: string; isProtected?: boolean; isOptionnal?: boolean; + isUnique?: boolean; disabled?: boolean; }; diff --git a/frontend/src/screens/edit-form-hook/dynamic-control-types.ts b/frontend/src/screens/edit-form-hook/dynamic-control-types.ts index f86ce9839d..b613d37702 100644 --- a/frontend/src/screens/edit-form-hook/dynamic-control-types.ts +++ b/frontend/src/screens/edit-form-hook/dynamic-control-types.ts @@ -95,5 +95,6 @@ export interface DynamicFieldData { isProtected?: boolean; isOptionnal?: boolean; isReadOnly?: boolean; + isUnique?: boolean; disabled?: boolean; } diff --git a/frontend/src/utils/formStructureForCreateEdit.ts b/frontend/src/utils/formStructureForCreateEdit.ts index c0312fec9a..5f6f69525d 100644 --- a/frontend/src/utils/formStructureForCreateEdit.ts +++ b/frontend/src/utils/formStructureForCreateEdit.ts @@ -123,6 +123,7 @@ const getFormStructureForCreateEdit = ( }, isOptionnal: attribute.optional, isReadOnly: attribute.read_only, + isUnique: 
attribute.unique, isProtected: getIsDisabled({ owner: row && row[attribute.name]?.owner, user, From acaf961d44e5331912238fb25b9a5e3277d705d5 Mon Sep 17 00:00:00 2001 From: Bilal Date: Mon, 20 Nov 2023 13:12:46 +0100 Subject: [PATCH 111/446] added icon to indicate that an input value must be unique --- frontend/src/components-form/input.tsx | 31 +++++++++++++++++++++----- 1 file changed, 25 insertions(+), 6 deletions(-) diff --git a/frontend/src/components-form/input.tsx b/frontend/src/components-form/input.tsx index a04a4ca999..4298d868e4 100644 --- a/frontend/src/components-form/input.tsx +++ b/frontend/src/components-form/input.tsx @@ -1,7 +1,8 @@ -import { LockClosedIcon } from "@heroicons/react/24/outline"; import { Input } from "../components/input"; import { FormFieldError } from "../screens/edit-form-hook/form"; import { classNames } from "../utils/common"; +import { Icon } from "@iconify-icon/react"; +import { Tooltip } from "../components/tooltip"; type OpsInputProps = { label: string; @@ -16,16 +17,34 @@ type OpsInputProps = { disabled?: boolean; }; +type InputTooltip = { + label: string; +}; + +const InputUniqueTooltip = ({ label = "This field" }: InputTooltip) => ( + + + +); + +const InputProtectedTooltip = ({ label = "This field" }: InputTooltip) => ( + + + +); + export const OpsInput = (props: OpsInputProps) => { - const { className, onChange, value, label, error, isProtected, isOptionnal, disabled } = props; + const { className, onChange, value, label, error, isProtected, isOptionnal, isUnique, disabled } = + props; return ( <> -
-
{(attribute?.optional === false || attribute?.unique) && ( - + + + )} {(attribute?.unique === true || attribute?.unique) && ( - + + + )}
@@ -83,10 +83,10 @@ export default function ObjectRows(props: Props) {
{relationship.cardinality === "one" && ( - + )} {relationship.cardinality === "many" && ( - + )}
From 3a70ff800f28e5cb44b023fbe7185b3bdf14f298 Mon Sep 17 00:00:00 2001 From: Bilal Date: Mon, 20 Nov 2023 14:16:10 +0100 Subject: [PATCH 113/446] fix test --- frontend/src/utils/formStructureForCreateEdit.ts | 2 +- frontend/tests/mocks/data/accountToken.ts | 3 +++ 2 files changed, 4 insertions(+), 1 deletion(-) diff --git a/frontend/src/utils/formStructureForCreateEdit.ts b/frontend/src/utils/formStructureForCreateEdit.ts index 5f6f69525d..07bdf7bd59 100644 --- a/frontend/src/utils/formStructureForCreateEdit.ts +++ b/frontend/src/utils/formStructureForCreateEdit.ts @@ -123,13 +123,13 @@ const getFormStructureForCreateEdit = ( }, isOptionnal: attribute.optional, isReadOnly: attribute.read_only, - isUnique: attribute.unique, isProtected: getIsDisabled({ owner: row && row[attribute.name]?.owner, user, isProtected: row && row[attribute.name] && row[attribute.name].is_protected, isReadOnly: attribute.read_only, }), + isUnique: attribute.unique, }); }); diff --git a/frontend/tests/mocks/data/accountToken.ts b/frontend/tests/mocks/data/accountToken.ts index 47d55b24d2..0012cdc8dc 100644 --- a/frontend/tests/mocks/data/accountToken.ts +++ b/frontend/tests/mocks/data/accountToken.ts @@ -803,6 +803,7 @@ export const accountTokenFormStructure = [ config: {}, isOptionnal: true, isProtected: false, + isUnique: false, }, { name: "token.value", @@ -814,6 +815,7 @@ export const accountTokenFormStructure = [ config: {}, isOptionnal: false, isProtected: false, + isUnique: true, }, { name: "expiration.value", @@ -825,6 +827,7 @@ export const accountTokenFormStructure = [ config: {}, isOptionnal: true, isProtected: false, + isUnique: false, }, { name: "account.id", From 598a365d9ed312f7d36b95a45d73c3ef1e27298a Mon Sep 17 00:00:00 2001 From: Bilal Date: Tue, 21 Nov 2023 11:13:58 +0100 Subject: [PATCH 114/446] replace unique icon with subtext --- frontend/src/components-form/input.tsx | 18 +++++------------- 1 file changed, 5 insertions(+), 13 deletions(-) diff --git 
a/frontend/src/components-form/input.tsx b/frontend/src/components-form/input.tsx index 4298d868e4..9f2fbd51f8 100644 --- a/frontend/src/components-form/input.tsx +++ b/frontend/src/components-form/input.tsx @@ -17,18 +17,10 @@ type OpsInputProps = { disabled?: boolean; }; -type InputTooltip = { - label: string; -}; - -const InputUniqueTooltip = ({ label = "This field" }: InputTooltip) => ( - - - -); +const InputUniqueTips = () => must be unique; -const InputProtectedTooltip = ({ label = "This field" }: InputTooltip) => ( - +const InputProtectedTooltip = () => ( + ); @@ -43,8 +35,8 @@ export const OpsInput = (props: OpsInputProps) => { - {isUnique && } - {isProtected && } + {isProtected && } + {isUnique && }
Date: Tue, 21 Nov 2023 12:15:49 +0100 Subject: [PATCH 115/446] reverted lock icon for consistency with other inputs --- frontend/src/components-form/input.tsx | 11 ++--------- 1 file changed, 2 insertions(+), 9 deletions(-) diff --git a/frontend/src/components-form/input.tsx b/frontend/src/components-form/input.tsx index 9f2fbd51f8..101cc1a914 100644 --- a/frontend/src/components-form/input.tsx +++ b/frontend/src/components-form/input.tsx @@ -1,8 +1,7 @@ +import { LockClosedIcon } from "@heroicons/react/24/outline"; import { Input } from "../components/input"; import { FormFieldError } from "../screens/edit-form-hook/form"; import { classNames } from "../utils/common"; -import { Icon } from "@iconify-icon/react"; -import { Tooltip } from "../components/tooltip"; type OpsInputProps = { label: string; @@ -19,12 +18,6 @@ type OpsInputProps = { const InputUniqueTips = () => must be unique; -const InputProtectedTooltip = () => ( - - - -); - export const OpsInput = (props: OpsInputProps) => { const { className, onChange, value, label, error, isProtected, isOptionnal, isUnique, disabled } = props; @@ -35,7 +28,7 @@ export const OpsInput = (props: OpsInputProps) => { - {isProtected && } + {isProtected && } {isUnique && } Date: Tue, 21 Nov 2023 15:32:19 +0100 Subject: [PATCH 116/446] Add helm-chart for infrahub --- helm/.helmignore | 23 +++ helm/Chart.yaml | 21 +++ helm/templates/_helpers.tpl | 62 +++++++ helm/templates/cache.yaml | 57 ++++++ helm/templates/configmap.yaml | 42 +++++ helm/templates/database.yaml | 136 ++++++++++++++ helm/templates/infrahub-git.yaml | 120 ++++++++++++ .../infrahub-server-db-init-job.yaml | 51 ++++++ helm/templates/infrahub-server-ingress.yaml | 18 ++ helm/templates/infrahub-server.yaml | 161 +++++++++++++++++ helm/templates/job-viewer-role.yaml | 29 +++ helm/templates/message-queue.yaml | 83 +++++++++ helm/values.yaml | 171 ++++++++++++++++++ 13 files changed, 974 insertions(+) create mode 100644 helm/.helmignore create mode 100644 
helm/Chart.yaml create mode 100644 helm/templates/_helpers.tpl create mode 100644 helm/templates/cache.yaml create mode 100644 helm/templates/configmap.yaml create mode 100644 helm/templates/database.yaml create mode 100644 helm/templates/infrahub-git.yaml create mode 100644 helm/templates/infrahub-server-db-init-job.yaml create mode 100644 helm/templates/infrahub-server-ingress.yaml create mode 100644 helm/templates/infrahub-server.yaml create mode 100644 helm/templates/job-viewer-role.yaml create mode 100644 helm/templates/message-queue.yaml create mode 100644 helm/values.yaml diff --git a/helm/.helmignore b/helm/.helmignore new file mode 100644 index 0000000000..0e8a0eb36f --- /dev/null +++ b/helm/.helmignore @@ -0,0 +1,23 @@ +# Patterns to ignore when building packages. +# This supports shell glob matching, relative path matching, and +# negation (prefixed with !). Only one pattern per line. +.DS_Store +# Common VCS dirs +.git/ +.gitignore +.bzr/ +.bzrignore +.hg/ +.hgignore +.svn/ +# Common backup files +*.swp +*.bak +*.tmp +*.orig +*~ +# Various IDEs +.project +.idea/ +*.tmproj +.vscode/ diff --git a/helm/Chart.yaml b/helm/Chart.yaml new file mode 100644 index 0000000000..4920d7bdfc --- /dev/null +++ b/helm/Chart.yaml @@ -0,0 +1,21 @@ +apiVersion: v2 +name: infrahub +description: A Helm chart to deploy Infrahub on Kubernetes +# A chart can be either an 'application' or a 'library' chart. +# +# Application charts are a collection of templates that can be packaged into versioned archives +# to be deployed. +# +# Library charts provide useful utilities or functions for the chart developer. They're included as +# a dependency of application charts to inject those utilities and functions into the rendering +# pipeline. Library charts do not define any templates and therefore cannot be deployed. +type: application +# This is the chart version. 
This version number should be incremented each time you make changes +# to the chart and its templates, including the app version. +# Versions are expected to follow Semantic Versioning (https://semver.org/) +version: 0.1.5 +# This is the version number of the application being deployed. This version number should be +# incremented each time you make changes to the application. Versions are not expected to +# follow Semantic Versioning. They should reflect the version the application is using. +# It is recommended to use it with quotes. +appVersion: "0.8.2" diff --git a/helm/templates/_helpers.tpl b/helm/templates/_helpers.tpl new file mode 100644 index 0000000000..7dcd5b6cdf --- /dev/null +++ b/helm/templates/_helpers.tpl @@ -0,0 +1,62 @@ +{{/* +Expand the name of the chart. +*/}} +{{- define "infrahub-helm.name" -}} +{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }} +{{- end }} + +{{/* +Create a default fully qualified app name. +We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec). +If release name contains chart name it will be used as a full name. +*/}} +{{- define "infrahub-helm.fullname" -}} +{{- if .Values.fullnameOverride }} +{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }} +{{- else }} +{{- $name := default .Chart.Name .Values.nameOverride }} +{{- if contains $name .Release.Name }} +{{- .Release.Name | trunc 63 | trimSuffix "-" }} +{{- else }} +{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }} +{{- end }} +{{- end }} +{{- end }} + +{{/* +Create chart name and version as used by the chart label. +*/}} +{{- define "infrahub-helm.chart" -}} +{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }} +{{- end }} + +{{/* +Common labels +*/}} +{{- define "infrahub-helm.labels" -}} +helm.sh/chart: {{ include "infrahub-helm.chart" . }} +{{ include "infrahub-helm.selectorLabels" . 
}} +{{- if .Chart.AppVersion }} +app.kubernetes.io/version: {{ .Chart.AppVersion | quote }} +{{- end }} +app.kubernetes.io/managed-by: {{ .Release.Service }} +{{- end }} + +{{/* +Selector labels +*/}} +{{- define "infrahub-helm.selectorLabels" -}} +app.kubernetes.io/name: {{ include "infrahub-helm.name" . }} +app.kubernetes.io/instance: {{ .Release.Name }} +{{- end }} + +{{/* +Create the name of the service account to use +*/}} +{{- define "infrahub-helm.serviceAccountName" -}} +{{- if .Values.serviceAccount.create }} +{{- default (include "infrahub-helm.fullname" .) .Values.serviceAccount.name }} +{{- else }} +{{- default "default" .Values.serviceAccount.name }} +{{- end }} +{{- end }} diff --git a/helm/templates/cache.yaml b/helm/templates/cache.yaml new file mode 100644 index 0000000000..a5df3606a0 --- /dev/null +++ b/helm/templates/cache.yaml @@ -0,0 +1,57 @@ +apiVersion: apps/v1 +kind: Deployment +metadata: + name: {{ include "infrahub-helm.fullname" . }}-cache + labels: + io.kompose.service: cache + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + replicas: {{ .Values.cache.replicas }} + selector: + matchLabels: + io.kompose.service: cache + {{- include "infrahub-helm.selectorLabels" . | nindent 6 }} + template: + metadata: + labels: + io.kompose.service: cache + {{- include "infrahub-helm.selectorLabels" . | nindent 8 }} + spec: + containers: + - env: + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.cache.cache.image.repository }}:{{ .Values.cache.cache.image.tag }} + livenessProbe: + exec: + command: + - sh + - -c + - redis-cli -a ping + failureThreshold: 3 + periodSeconds: 5 + timeoutSeconds: 5 + name: cache + ports: + {{- range .Values.cache.ports }} + - containerPort: {{ .targetPort }} + hostPort: {{ .port }} + {{- end }} + resources: {} + restartPolicy: Always + +--- +apiVersion: v1 +kind: Service +metadata: + name: {{ include "infrahub-helm.fullname" . 
}}-cache + labels: + io.kompose.service: cache + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + type: {{ .Values.cache.type }} + selector: + io.kompose.service: cache + {{- include "infrahub-helm.selectorLabels" . | nindent 4 }} + ports: + {{- .Values.cache.ports | toYaml | nindent 2 }} diff --git a/helm/templates/configmap.yaml b/helm/templates/configmap.yaml new file mode 100644 index 0000000000..20ca7e6880 --- /dev/null +++ b/helm/templates/configmap.yaml @@ -0,0 +1,42 @@ +apiVersion: v1 +kind: ConfigMap +metadata: + name: {{ .Chart.Name }}-{{ .Chart.AppVersion }}-configmap +data: + infrahub.toml: | + [main] + internal_address = "http://{{ include "infrahub-helm.fullname" . }}-server.{{ .Release.Namespace }}.svc.{{ .Values.kubernetesClusterDomain }}:8000" + + [git] + repositories_directory = "/opt/infrahub/git" + + [database] + username = "neo4j" + password = "admin" + address = "{{ include "infrahub-helm.fullname" . }}-database.{{ .Release.Namespace }}.svc.{{ .Values.kubernetesClusterDomain }}" + port = "{{ (index .Values.database.ports 2).port }}" + protocol = "bolt" + + [broker] + address = "{{ include "infrahub-helm.fullname" . }}-message-queue.{{ .Release.Namespace }}.svc.{{ .Values.kubernetesClusterDomain }}" + username = "infrahub" + password = "infrahub" + + [cache] + enable = true + address = "{{ include "infrahub-helm.fullname" . 
}}-cache.{{ .Release.Namespace }}.svc.{{ .Values.kubernetesClusterDomain }}" + port = "{{ (index .Values.cache.ports 0).port }}" + + [api] + cors_allow_origins = ["*"] + + [storage.settings] + directory = "/opt/infrahub/storage" + + [trace] + enable = false + insecure = "True" + exporter_type = "otlp" + exporter_protocol = "grpc" + exporter_endpoint = "tempo" # Assuming this is external or not managed by Helm + exporter_port = 4317 diff --git a/helm/templates/database.yaml b/helm/templates/database.yaml new file mode 100644 index 0000000000..2622d6b174 --- /dev/null +++ b/helm/templates/database.yaml @@ -0,0 +1,136 @@ +apiVersion: apps/v1 +kind: Deployment +metadata: + name: {{ include "infrahub-helm.fullname" . }}-database + labels: + io.kompose.service: database + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + replicas: {{ .Values.database.replicas }} + selector: + matchLabels: + io.kompose.service: database + {{- include "infrahub-helm.selectorLabels" . | nindent 6 }} + template: + metadata: + labels: + io.kompose.service: database + {{- include "infrahub-helm.selectorLabels" . 
| nindent 8 }} + spec: + containers: + - env: + - name: NEO4J_ACCEPT_LICENSE_AGREEMENT + value: {{ quote .Values.database.database.env.neo4JAcceptLicenseAgreement }} + - name: NEO4J_AUTH + value: {{ quote .Values.database.database.env.neo4JAuth }} + - name: NEO4J_dbms_security_auth__minimum__password__length + value: {{ quote .Values.database.database.env.neo4JDbmsSecurityAuthMinimumPasswordLength + }} + - name: NEO4J_dbms_security_procedures_unrestricted + value: {{ quote .Values.database.database.env.neo4JDbmsSecurityProceduresUnrestricted + }} + - name: NEO4J_server_metrics_prometheus_enabled + value: {{ quote .Values.database.database.env.neo4JServerMetricsPrometheusEnabled + }} + - name: NEO4J_server_metrics_prometheus_endpoint + value: {{ quote .Values.database.database.env.neo4JServerMetricsPrometheusEndpoint + }} + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.database.database.image.repository }}:{{ .Values.database.database.image.tag }} + livenessProbe: + tcpSocket: + port: {{ (index .Values.database.ports 1).port }} + failureThreshold: 20 + initialDelaySeconds: 3 + periodSeconds: 2 + timeoutSeconds: 10 + name: database + ports: + {{- range .Values.database.ports }} + - containerPort: {{ .targetPort }} + hostPort: {{ .port }} + protocol: TCP + name: {{ .name }} + {{- end }} + resources: {} + volumeMounts: + - mountPath: /plugins + name: database-hostpath0 + readOnly: true + - mountPath: /data + name: database-data + - mountPath: /logs + name: database-logs + restartPolicy: Always + volumes: + - hostPath: + path: /tmp/plugins + name: database-hostpath0 + - hostPath: + path: /tmp/infrahub-helm + name: database-data + - hostPath: + path: /tmp/infrahub-helm + name: database-logs + +--- +apiVersion: v1 +kind: Service +metadata: + name: {{ include "infrahub-helm.fullname" . }}-database + labels: + io.kompose.service: database + {{- include "infrahub-helm.labels" . 
| nindent 4 }} +spec: + type: {{ .Values.database.type }} + selector: + io.kompose.service: database + {{- include "infrahub-helm.selectorLabels" . | nindent 4 }} + ports: + {{- .Values.database.ports | toYaml | nindent 2 }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-database-claim0 + labels: + io.kompose.service: database-claim0 + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.databaseClaim0.storageRequest | quote }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-database-data + labels: + io.kompose.service: database-data + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.databaseData.storageRequest | quote }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-database-logs + labels: + io.kompose.service: database-logs + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.databaseLogs.storageRequest | quote }} diff --git a/helm/templates/infrahub-git.yaml b/helm/templates/infrahub-git.yaml new file mode 100644 index 0000000000..2b53fc88e4 --- /dev/null +++ b/helm/templates/infrahub-git.yaml @@ -0,0 +1,120 @@ +apiVersion: apps/v1 +kind: Deployment +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-git + labels: + io.kompose.service: infrahub-git + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + replicas: {{ .Values.infrahubGit.replicas }} + selector: + matchLabels: + io.kompose.service: infrahub-git + {{- include "infrahub-helm.selectorLabels" . 
| nindent 6 }} + template: + metadata: + labels: + io.kompose.service: infrahub-git + {{- include "infrahub-helm.selectorLabels" . | nindent 8 }} + spec: + containers: + - args: {{- toYaml .Values.infrahubGit.infrahubGit.args | nindent 8 }} + env: + - name: INFRAHUB_ADDRESS + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubAddress }} + - name: INFRAHUB_CACHE_PORT + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubCachePort }} + - name: INFRAHUB_CONFIG + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubConfig }} + - name: INFRAHUB_DB_TYPE + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubDbType }} + - name: INFRAHUB_LOG_LEVEL + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubLogLevel }} + - name: INFRAHUB_PRODUCTION + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubProduction }} + - name: INFRAHUB_SDK_API_TOKEN + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubSdkApiToken }} + - name: INFRAHUB_SDK_TIMEOUT + value: {{ quote .Values.infrahubGit.infrahubGit.env.infrahubSdkTimeout }} + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubGit.infrahubGit.image.repository }}:{{ .Values.infrahubGit.infrahubGit.image.tag | default .Chart.AppVersion }} + imagePullPolicy: {{ .Values.infrahubGit.infrahubGit.imagePullPolicy }} + name: infrahub-git + resources: {} + tty: true + volumeMounts: + - mountPath: /opt/infrahub/git + name: git-data + - mountPath: /remote + name: git-remote-data + - name: infrahub-config-volume + mountPath: /config + initContainers: + - command: + - sh + - -c + - until nslookup {{ include "infrahub-helm.fullname" . }}-infrahub-server.$(cat /var/run/secrets/kubernetes.io/serviceaccount/namespace).svc.{{ .Values.kubernetesClusterDomain }}; + do echo waiting for {{ include "infrahub-helm.fullname" . 
}}-infrahub-server; sleep 2; done; + env: + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubGit.waitForInfrahubServer.image.repository }}:{{ .Values.infrahubGit.waitForInfrahubServer.image.tag | default .Chart.AppVersion }} + name: wait-for-infrahub-server + resources: {} + restartPolicy: Always + volumes: + - hostPath: + path: /tmp/infrahub-helm + name: git-data + - hostPath: + path: /tmp/infrahub-helm + name: git-remote-data + - name: infrahub-config-volume + configMap: + name: {{ .Chart.Name }}-{{ .Chart.AppVersion }}-configmap + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-git-data + labels: + io.kompose.service: git-data + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.gitData.storageRequest | quote }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-git-remote-data + labels: + io.kompose.service: git-remote-data + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.gitRemoteData.storageRequest | quote }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-git-claim2 + labels: + io.kompose.service: infrahub-git-claim2 + {{- include "infrahub-helm.labels" . 
| nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.infrahubGitClaim2.storageRequest | quote }} diff --git a/helm/templates/infrahub-server-db-init-job.yaml b/helm/templates/infrahub-server-db-init-job.yaml new file mode 100644 index 0000000000..5adc8fe2ef --- /dev/null +++ b/helm/templates/infrahub-server-db-init-job.yaml @@ -0,0 +1,51 @@ +apiVersion: batch/v1 +kind: Job +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server-db-init-job + labels: + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + backoffLimit: {{ .Values.infrahubServerDbInitJob.backoffLimit }} + template: + spec: + containers: + - command: + - sh + - -c + - infrahub db init + env: + - name: INFRAHUB_CACHE_PORT + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubCachePort }} + - name: INFRAHUB_CONFIG + value: {{ quote .Values.infrahubServerDbInitJob.infrahubServerDbInitJob.env.infrahubConfig }} + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubServerDbInitJob.infrahubServerDbInitJob.image.repository }}:{{ .Values.infrahubServerDbInitJob.infrahubServerDbInitJob.image.tag | default .Chart.AppVersion }} + name: infrahub-server-db-init-job + resources: {} + volumeMounts: + - name: infrahub-config-volume + mountPath: /config + initContainers: + - command: + - sh + - -c + - until nslookup {{ include "infrahub-helm.fullname" . }}-database.$(cat /var/run/secrets/kubernetes.io/serviceaccount/namespace).svc.{{ .Values.kubernetesClusterDomain }}; + do echo waiting for {{ include "infrahub-helm.fullname" . 
}}-database; sleep 2; done; + env: + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubServerDbInitJob.waitForDatabase.image.repository }}:{{ .Values.infrahubServerDbInitJob.waitForDatabase.image.tag | default .Chart.AppVersion }} + name: wait-for-database + resources: {} + restartPolicy: Never + volumes: + - hostPath: + path: /tmp/infrahub-helm + name: git-data + - hostPath: + path: /tmp/infrahub-helm + name: git-remote-data + - name: infrahub-config-volume + configMap: + name: {{ .Chart.Name }}-{{ .Chart.AppVersion }}-configmap \ No newline at end of file diff --git a/helm/templates/infrahub-server-ingress.yaml b/helm/templates/infrahub-server-ingress.yaml new file mode 100644 index 0000000000..585b474a91 --- /dev/null +++ b/helm/templates/infrahub-server-ingress.yaml @@ -0,0 +1,18 @@ +apiVersion: networking.k8s.io/v1 +kind: Ingress +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server-ingress + labels: + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + rules: + - host: infrahub-{{ .Values.kubernetesClusterDomain }} + http: + paths: + - path: / + pathType: Prefix + backend: + service: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server + port: + number: {{ (index .Values.infrahubServer.ports 0).port }} diff --git a/helm/templates/infrahub-server.yaml b/helm/templates/infrahub-server.yaml new file mode 100644 index 0000000000..6490e24739 --- /dev/null +++ b/helm/templates/infrahub-server.yaml @@ -0,0 +1,161 @@ +--- +apiVersion: apps/v1 +kind: Deployment +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server + labels: + io.kompose.service: infrahub-server + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + replicas: {{ .Values.infrahubServer.replicas }} + selector: + matchLabels: + io.kompose.service: infrahub-server + {{- include "infrahub-helm.selectorLabels" . 
| nindent 6 }} + template: + metadata: + labels: + io.kompose.service: infrahub-server + {{- include "infrahub-helm.selectorLabels" . | nindent 8 }} + spec: + containers: + - args: {{- toYaml .Values.infrahubServer.infrahubServer.args | nindent 8 }} + env: + - name: INFRAHUB_ALLOW_ANONYMOUS_ACCESS + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubAllowAnonymousAccess }} + - name: INFRAHUB_CACHE_PORT + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubCachePort }} + - name: INFRAHUB_CONFIG + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubConfig }} + - name: INFRAHUB_DB_TYPE + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubDbType }} + - name: INFRAHUB_FRONTEND_DIRECTORY + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubFrontendDirectory }} + - name: INFRAHUB_LOG_LEVEL + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubLogLevel }} + - name: INFRAHUB_PRODUCTION + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubProduction }} + - name: INFRAHUB_SECURITY_INITIAL_ADMIN_TOKEN + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubSecurityInitialAdminToken }} + - name: INFRAHUB_SECURITY_SECRET_KEY + value: {{ quote .Values.infrahubServer.infrahubServer.env.infrahubSecuritySecretKey }} + - name: PROMETHEUS_MULTIPROC_DIR + value: {{ quote .Values.infrahubServer.infrahubServer.env.prometheusMultiprocDir }} + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubServer.infrahubServer.image.repository }}:{{ .Values.infrahubServer.infrahubServer.image.tag | default .Chart.AppVersion }} + imagePullPolicy: {{ .Values.infrahubServer.infrahubServer.imagePullPolicy }} + livenessProbe: + exec: + command: + - sh + - -c + - wget -O /dev/null http://localhost:{{ (index .Values.infrahubServer.ports 0).port }}/api/schema || exit 1 + failureThreshold: 20 + initialDelaySeconds: 10 + periodSeconds: 5 + 
timeoutSeconds: 5 + name: infrahub-server + ports: + {{- range .Values.infrahubServer.ports }} + {{- if eq .name "infrahub-gui" }} + - containerPort: {{ .targetPort }} + hostPort: {{ .port }} + protocol: TCP + {{- end }} + {{- end }} + resources: {} + tty: true + volumeMounts: + - mountPath: /opt/infrahub/storage + name: infrahub-server-storage-data + - name: infrahub-config-volume + mountPath: /config + initContainers: + - command: + - sh + - -c + - until nslookup {{ include "infrahub-helm.fullname" . }}-database.$(cat /var/run/secrets/kubernetes.io/serviceaccount/namespace).svc.{{ .Values.kubernetesClusterDomain }}; + do echo waiting for {{ include "infrahub-helm.fullname" . }}-database; sleep 2; done; + env: + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubServer.waitForDatabase.image.repository }}:{{ .Values.infrahubServer.waitForDatabase.image.tag | default .Chart.AppVersion }} + name: wait-for-database + resources: {} + - command: + - sh + - -c + - until nslookup {{ include "infrahub-helm.fullname" . }}-message-queue.$(cat /var/run/secrets/kubernetes.io/serviceaccount/namespace).svc.{{ .Values.kubernetesClusterDomain }}; + do echo waiting for {{ include "infrahub-helm.fullname" . }}-message-queue; sleep 2; done; + env: + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubServer.waitForMessageQueue.image.repository }}:{{ .Values.infrahubServer.waitForMessageQueue.image.tag | default .Chart.AppVersion }} + name: wait-for-message-queue + resources: {} + - command: + - sh + - -c + - until nslookup {{ include "infrahub-helm.fullname" . }}-cache.$(cat /var/run/secrets/kubernetes.io/serviceaccount/namespace).svc.{{ .Values.kubernetesClusterDomain }}; + do echo waiting for {{ include "infrahub-helm.fullname" . 
}}-cache; sleep 2; done; + env: + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.infrahubServer.waitForCache.image.repository }}:{{ .Values.infrahubServer.waitForCache.image.tag | default .Chart.AppVersion }} + name: wait-for-cache + resources: {} + restartPolicy: Always + volumes: + - hostPath: + path: /tmp/infrahub-helm + name: infrahub-server-storage-data + - name: infrahub-config-volume + configMap: + name: {{ .Chart.Name }}-{{ .Chart.AppVersion }}-configmap + +--- +apiVersion: v1 +kind: Service +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server + labels: + io.kompose.service: infrahub-server + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + type: {{ .Values.infrahubServer.type }} + selector: + io.kompose.service: infrahub-server + {{- include "infrahub-helm.selectorLabels" . | nindent 4 }} + ports: + {{- .Values.infrahubServer.ports | toYaml | nindent 2 }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server-claim1 + labels: + io.kompose.service: infrahub-server-claim1 + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.infrahubServerClaim1.storageRequest | quote }} + +--- +apiVersion: v1 +kind: PersistentVolumeClaim +metadata: + name: {{ include "infrahub-helm.fullname" . }}-infrahub-server-storage-data + labels: + io.kompose.service: infrahub-server-storage-data + {{- include "infrahub-helm.labels" . 
| nindent 4 }} +spec: + accessModes: + - ReadWriteOnce + resources: + requests: + storage: {{ .Values.pvc.infrahubServerStorageData.storageRequest | quote }} diff --git a/helm/templates/job-viewer-role.yaml b/helm/templates/job-viewer-role.yaml new file mode 100644 index 0000000000..d345eb9f5b --- /dev/null +++ b/helm/templates/job-viewer-role.yaml @@ -0,0 +1,29 @@ +apiVersion: rbac.authorization.k8s.io/v1 +kind: Role +metadata: + name: {{ include "infrahub-helm.fullname" . }}-job-viewer + labels: + {{- include "infrahub-helm.labels" . | nindent 4 }} +rules: +- apiGroups: + - batch + resources: + - jobs + verbs: + - get + +--- +apiVersion: rbac.authorization.k8s.io/v1 +kind: RoleBinding +metadata: + name: {{ include "infrahub-helm.fullname" . }}-job-viewer-binding + labels: + {{- include "infrahub-helm.labels" . | nindent 4 }} +roleRef: + apiGroup: rbac.authorization.k8s.io + kind: Role + name: '{{ include "infrahub-helm.fullname" . }}-job-viewer' +subjects: +- kind: ServiceAccount + name: default + namespace: '{{ .Release.Namespace }}' diff --git a/helm/templates/message-queue.yaml b/helm/templates/message-queue.yaml new file mode 100644 index 0000000000..2d362cbcca --- /dev/null +++ b/helm/templates/message-queue.yaml @@ -0,0 +1,83 @@ +apiVersion: apps/v1 +kind: Deployment +metadata: + name: {{ include "infrahub-helm.fullname" . }}-message-queue + labels: + io.kompose.service: message-queue + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + replicas: {{ .Values.messageQueue.replicas }} + selector: + matchLabels: + io.kompose.service: message-queue + {{- include "infrahub-helm.selectorLabels" . | nindent 6 }} + template: + metadata: + labels: + io.kompose.service: message-queue + {{- include "infrahub-helm.selectorLabels" . 
| nindent 8 }} + spec: + containers: + - env: + - name: RABBITMQ_DEFAULT_PASS + value: {{ quote .Values.messageQueue.messageQueue.env.rabbitmqDefaultPass }} + - name: RABBITMQ_DEFAULT_USER + value: {{ quote .Values.messageQueue.messageQueue.env.rabbitmqDefaultUser }} + - name: KUBERNETES_CLUSTER_DOMAIN + value: {{ quote .Values.kubernetesClusterDomain }} + image: {{ .Values.messageQueue.messageQueue.image.repository }}:{{ .Values.messageQueue.messageQueue.image.tag }} + livenessProbe: + exec: + command: + - sh + - -c + - rabbitmq-diagnostics -q ping + failureThreshold: 3 + periodSeconds: 5 + timeoutSeconds: 30 + name: message-queue + ports: + - containerPort: 4369 + hostPort: 4369 + protocol: TCP + - containerPort: 5671 + hostPort: 5671 + protocol: TCP + - containerPort: 5672 + hostPort: 5672 + protocol: TCP + - containerPort: 15671 + hostPort: 15671 + protocol: TCP + - containerPort: 15672 + hostPort: 15672 + name: management + protocol: TCP + - containerPort: 15691 + hostPort: 15691 + protocol: TCP + - containerPort: 15692 + hostPort: 15692 + name: metrics + protocol: TCP + - containerPort: 2567 + hostPort: 2567 + protocol: TCP + resources: {} + restartPolicy: Always + +--- +apiVersion: v1 +kind: Service +metadata: + name: {{ include "infrahub-helm.fullname" . }}-message-queue + labels: + io.kompose.service: message-queue + {{- include "infrahub-helm.labels" . | nindent 4 }} +spec: + type: {{ .Values.messageQueue.type }} + selector: + io.kompose.service: message-queue + {{- include "infrahub-helm.selectorLabels" . 
| nindent 4 }} + ports: + {{- .Values.messageQueue.ports | toYaml | nindent 2 -}} diff --git a/helm/values.yaml b/helm/values.yaml new file mode 100644 index 0000000000..a92393092d --- /dev/null +++ b/helm/values.yaml @@ -0,0 +1,171 @@ +cache: + cache: + image: + repository: redis + tag: "7.2" + ports: + - name: "client" + port: 6379 + targetPort: 6379 + - name: "gossip" + port: 16379 + targetPort: 16379 + replicas: 1 + type: ClusterIP +database: + database: + env: + neo4JAcceptLicenseAgreement: "yes" + neo4JAuth: neo4j/admin + neo4JDbmsSecurityAuthMinimumPasswordLength: "4" + neo4JDbmsSecurityProceduresUnrestricted: apoc.* + neo4JServerMetricsPrometheusEnabled: "true" + neo4JServerMetricsPrometheusEndpoint: 0.0.0.0:2004 + image: + repository: neo4j + tag: 5.11-enterprise + ports: + - name: "metrics" + port: 2004 + targetPort: 2004 + - name: "interface" + port: 7474 + targetPort: 7474 + - name: "bolt" + port: 7687 + targetPort: 7687 + replicas: 1 + type: ClusterIP +infrahubGit: + infrahubGit: + args: + - infrahub + - git-agent + - start + - --debug + env: + infrahubAddress: http://infrahub-server:8000 + infrahubCachePort: 6379 + infrahubConfig: /config/infrahub.toml + infrahubDbType: neo4j + infrahubLogLevel: DEBUG + infrahubProduction: "false" + infrahubSdkApiToken: 06438eb2-8019-4776-878c-0941b1f1d1ec + infrahubSdkTimeout: "20" + image: + repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub + # tag: 0.8.2 + imagePullPolicy: Always + replicas: 2 + waitForInfrahubServer: + image: + repository: busybox + tag: latest +infrahubServer: + infrahubServer: + args: + - gunicorn + - --config + - /source/backend/infrahub/serve/gunicorn_config.py + - --logger-class + - infrahub.serve.log.GunicornLogger + - infrahub.server:app + env: + infrahubAllowAnonymousAccess: "true" + infrahubCachePort: 6379 + infrahubConfig: /config/infrahub.toml + infrahubDbType: neo4j + infrahubFrontendDirectory: /opt/infrahub/frontend + infrahubLogLevel: INFO + infrahubProduction: 
"false" + infrahubSecurityInitialAdminToken: 06438eb2-8019-4776-878c-0941b1f1d1ec + infrahubSecuritySecretKey: 327f747f-efac-42be-9e73-999f08f86b92 + prometheusMultiprocDir: /prom_shared + image: + repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub + # tag: 0.8.2 + imagePullPolicy: Always + ports: + - name: "infrahub-gui" + port: 8000 + targetPort: 8000 + replicas: 1 + type: ClusterIP + waitForCache: + image: + repository: busybox + tag: latest + waitForDatabase: + image: + repository: busybox + tag: latest + waitForMessageQueue: + image: + repository: busybox + tag: latest +infrahubServerDbInitJob: + backoffLimit: 0 + infrahubServerDbInitJob: + env: + infrahubCachePort: 6379 + infrahubConfig: /config/infrahub.toml + image: + repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub + # tag: 0.8.2 + waitForDatabase: + image: + repository: busybox + tag: latest +kubernetesClusterDomain: cluster.local +messageQueue: + messageQueue: + env: + rabbitmqDefaultPass: infrahub + rabbitmqDefaultUser: infrahub + image: + repository: rabbitmq + tag: 3.12-management + ports: + - name: "4369" + port: 4369 + targetPort: 4369 + - name: "5671" + port: 5671 + targetPort: 5671 + - name: "5672" + port: 5672 + targetPort: 5672 + - name: "15671" + port: 15671 + targetPort: 15671 + - name: "15672" + port: 15672 + targetPort: 15672 + - name: "15691" + port: 15691 + targetPort: 15691 + - name: "15692" + port: 15692 + targetPort: 15692 + - name: "2567" + port: 2567 + targetPort: 2567 + replicas: 1 + type: ClusterIP +pvc: + databaseClaim0: + storageRequest: 100Mi + databaseData: + storageRequest: 100Mi + databaseLogs: + storageRequest: 100Mi + gitData: + storageRequest: 100Mi + gitRemoteData: + storageRequest: 100Mi + infrahubGitClaim2: + storageRequest: 100Mi + infrahubServerClaim1: + storageRequest: 100Mi + infrahubServerStorageData: + storageRequest: 100Mi From cb88ab7b90a1f94276c05c2ffee77cf0a0896034 Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: 
Tue, 21 Nov 2023 16:03:45 +0100 Subject: [PATCH 117/446] add ReadMe --- helm/README.md | 72 ++++++++++++++++++++++++++++++++++++++++++++++++ helm/values.yaml | 3 -- 2 files changed, 72 insertions(+), 3 deletions(-) create mode 100644 helm/README.md diff --git a/helm/README.md b/helm/README.md new file mode 100644 index 0000000000..59b4b64c93 --- /dev/null +++ b/helm/README.md @@ -0,0 +1,72 @@ +# Infrahub Helm Chart + +## Description +This Helm chart deploys Infrahub on Kubernetes. It provides configurable templates for various Kubernetes resources including caching, databases, and message queues, ensuring a scalable and efficient deployment of the Infrahub application. + +## Chart Structure +The chart includes the following files and directories: +- `Chart.yaml`: Chart metadata file. +- `templates/`: Contains the template files for Kubernetes resources. + - `_helpers.tpl`: Template helpers/definitions. + - `cache.yaml`: Defines the cache (Redis) deployment, service, and PVC. + - `configmap.yaml`: ConfigMap for application configuration. + - `database.yaml`: Database (Neo4j) deployment, service, and PVC. + - `infrahub-git.yaml`: Infrahub Git service deployment and service. + - `infrahub-server-db-init-job.yaml`: Initialization jobs for Infrahub Server. + - `infrahub-server-ingress.yaml`: Ingress configuration for Infrahub Server. + - `infrahub-server.yaml`: Infrahub Server deployment and service. + - `job-viewer-role.yaml`: Defines roles and permissions. + - `message-queue.yaml`: Message Queue (RabbitMQ) deployment and service. +- `values.yaml`: Defines configuration values for the chart. + +## ConfigMap for Infrahub Configuration +The `configmap.yaml` file defines a ConfigMap that includes the `infrahub.toml` configuration file. This approach avoids the need to load the configuration from a host path, making the deployment more portable and cloud-friendly. 
+ +The ConfigMap is structured as follows: +- The `internal_address` is dynamically set based on the release name, namespace, and cluster domain. +- Database, broker, cache, and other service addresses are set dynamically, referring to the relevant services within the Kubernetes cluster. +- Ports for services like the database and cache are pulled from the `values.yaml` file, ensuring flexibility and ease of configuration changes. + + +## Prerequisites +- Kubernetes 1.12+ +- Helm 3.0+ +- PV provisioner support in the underlying infrastructure (if persistence is required) + +## Installing the Chart +To install the chart with the release name `my-infrahub-release`: + +helm install my-infrahub-release path/to/infrahub/chart + +## Configuration +The following table lists the configurable parameters in the `values.yaml` file and their default values. + +| Parameter | Description | Default | +| --------- | ----------- | ------- | +| `cache.cache.image.repository` | The Redis image repository | `redis` | +| `cache.cache.image.tag` | The Redis image tag | `"7.2"` | +| `database.database.image.repository` | The Neo4j image repository | `neo4j` | +| `database.database.image.tag` | The Neo4j image tag | `5.11-enterprise` | +| `infrahubGit.infrahubGit.image.repository` | The Infrahub Git image repository | `9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub` | +| `infrahubServer.infrahubServer.image.repository` | The Infrahub Server image repository | `9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub` | +| `messageQueue.messageQueue.image.repository` | The RabbitMQ image repository | `rabbitmq` | +| `messageQueue.messageQueue.image.tag` | The RabbitMQ image tag | `3.12-management` | +| ... | ... | ... | + +For more detailed configuration and additional parameters, refer to the `values.yaml` file. 
+ +## Upgrading the Chart +To upgrade the chart to a new version: + +helm upgrade my-infrahub-release path/to/infrahub/chart + +## Uninstalling the Chart +To uninstall/delete the `my-infrahub-release` deployment: + +helm delete my-infrahub-release + +## Persistence +The chart offers the ability to configure persistence for the database and other components. Check the `pvc` section in `values.yaml` for more details. + +## Customization +The chart is customizable through `values.yaml`. For more complex customizations and usage scenarios, refer to the official Helm documentation. diff --git a/helm/values.yaml b/helm/values.yaml index a92393092d..0b27cd906f 100644 --- a/helm/values.yaml +++ b/helm/values.yaml @@ -54,7 +54,6 @@ infrahubGit: infrahubSdkTimeout: "20" image: repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub - # tag: 0.8.2 imagePullPolicy: Always replicas: 2 waitForInfrahubServer: @@ -83,7 +82,6 @@ infrahubServer: prometheusMultiprocDir: /prom_shared image: repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub - # tag: 0.8.2 imagePullPolicy: Always ports: - name: "infrahub-gui" @@ -111,7 +109,6 @@ infrahubServerDbInitJob: infrahubConfig: /config/infrahub.toml image: repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub - # tag: 0.8.2 waitForDatabase: image: repository: busybox From 70c178e096e6e40c4ad13c3580432edd081a8a2a Mon Sep 17 00:00:00 2001 From: Benoit Kohler Date: Tue, 21 Nov 2023 16:10:24 +0100 Subject: [PATCH 118/446] closer to linter, but still some error with templating --- .yamllint.yml | 1 + helm/Chart.yaml | 1 + helm/templates/cache.yaml | 1 + helm/templates/configmap.yaml | 1 + helm/templates/database.yaml | 1 + helm/templates/infrahub-git.yaml | 1 + .../infrahub-server-db-init-job.yaml | 1 + helm/templates/infrahub-server-ingress.yaml | 1 + helm/templates/job-viewer-role.yaml | 1 + helm/templates/message-queue.yaml | 1 + helm/values.yaml | 105 +++++++++--------- 11 files changed, 
63 insertions(+), 52 deletions(-) diff --git a/.yamllint.yml b/.yamllint.yml index beb797f226..d81b8e147a 100644 --- a/.yamllint.yml +++ b/.yamllint.yml @@ -7,6 +7,7 @@ ignore: | /repositories /frontend/node_modules /node_modules + /helm rules: new-lines: disable diff --git a/helm/Chart.yaml b/helm/Chart.yaml index 4920d7bdfc..eb3ed11676 100644 --- a/helm/Chart.yaml +++ b/helm/Chart.yaml @@ -1,3 +1,4 @@ +--- apiVersion: v2 name: infrahub description: A Helm chart to deploy Infrahub on Kubernetes diff --git a/helm/templates/cache.yaml b/helm/templates/cache.yaml index a5df3606a0..8b67c9824c 100644 --- a/helm/templates/cache.yaml +++ b/helm/templates/cache.yaml @@ -1,3 +1,4 @@ +--- apiVersion: apps/v1 kind: Deployment metadata: diff --git a/helm/templates/configmap.yaml b/helm/templates/configmap.yaml index 20ca7e6880..fee94f2f08 100644 --- a/helm/templates/configmap.yaml +++ b/helm/templates/configmap.yaml @@ -1,3 +1,4 @@ +--- apiVersion: v1 kind: ConfigMap metadata: diff --git a/helm/templates/database.yaml b/helm/templates/database.yaml index 2622d6b174..d97be2ec87 100644 --- a/helm/templates/database.yaml +++ b/helm/templates/database.yaml @@ -1,3 +1,4 @@ +--- apiVersion: apps/v1 kind: Deployment metadata: diff --git a/helm/templates/infrahub-git.yaml b/helm/templates/infrahub-git.yaml index 2b53fc88e4..133173b375 100644 --- a/helm/templates/infrahub-git.yaml +++ b/helm/templates/infrahub-git.yaml @@ -1,3 +1,4 @@ +--- apiVersion: apps/v1 kind: Deployment metadata: diff --git a/helm/templates/infrahub-server-db-init-job.yaml b/helm/templates/infrahub-server-db-init-job.yaml index 5adc8fe2ef..63ba3698ab 100644 --- a/helm/templates/infrahub-server-db-init-job.yaml +++ b/helm/templates/infrahub-server-db-init-job.yaml @@ -1,3 +1,4 @@ +--- apiVersion: batch/v1 kind: Job metadata: diff --git a/helm/templates/infrahub-server-ingress.yaml b/helm/templates/infrahub-server-ingress.yaml index 585b474a91..b48a33e71c 100644 --- a/helm/templates/infrahub-server-ingress.yaml 
+++ b/helm/templates/infrahub-server-ingress.yaml @@ -1,3 +1,4 @@ +--- apiVersion: networking.k8s.io/v1 kind: Ingress metadata: diff --git a/helm/templates/job-viewer-role.yaml b/helm/templates/job-viewer-role.yaml index d345eb9f5b..cb736132e2 100644 --- a/helm/templates/job-viewer-role.yaml +++ b/helm/templates/job-viewer-role.yaml @@ -1,3 +1,4 @@ +--- apiVersion: rbac.authorization.k8s.io/v1 kind: Role metadata: diff --git a/helm/templates/message-queue.yaml b/helm/templates/message-queue.yaml index 2d362cbcca..618d40b0fb 100644 --- a/helm/templates/message-queue.yaml +++ b/helm/templates/message-queue.yaml @@ -1,3 +1,4 @@ +--- apiVersion: apps/v1 kind: Deployment metadata: diff --git a/helm/values.yaml b/helm/values.yaml index 0b27cd906f..6cb97c1858 100644 --- a/helm/values.yaml +++ b/helm/values.yaml @@ -1,15 +1,16 @@ +--- cache: cache: image: repository: redis tag: "7.2" ports: - - name: "client" - port: 6379 - targetPort: 6379 - - name: "gossip" - port: 16379 - targetPort: 16379 + - name: "client" + port: 6379 + targetPort: 6379 + - name: "gossip" + port: 16379 + targetPort: 16379 replicas: 1 type: ClusterIP database: @@ -25,24 +26,24 @@ database: repository: neo4j tag: 5.11-enterprise ports: - - name: "metrics" - port: 2004 - targetPort: 2004 - - name: "interface" - port: 7474 - targetPort: 7474 - - name: "bolt" - port: 7687 - targetPort: 7687 + - name: "metrics" + port: 2004 + targetPort: 2004 + - name: "interface" + port: 7474 + targetPort: 7474 + - name: "bolt" + port: 7687 + targetPort: 7687 replicas: 1 type: ClusterIP infrahubGit: infrahubGit: args: - - infrahub - - git-agent - - start - - --debug + - infrahub + - git-agent + - start + - --debug env: infrahubAddress: http://infrahub-server:8000 infrahubCachePort: 6379 @@ -63,12 +64,12 @@ infrahubGit: infrahubServer: infrahubServer: args: - - gunicorn - - --config - - /source/backend/infrahub/serve/gunicorn_config.py - - --logger-class - - infrahub.serve.log.GunicornLogger - - infrahub.server:app + - 
gunicorn + - --config + - /source/backend/infrahub/serve/gunicorn_config.py + - --logger-class + - infrahub.serve.log.GunicornLogger + - infrahub.server:app env: infrahubAllowAnonymousAccess: "true" infrahubCachePort: 6379 @@ -84,9 +85,9 @@ infrahubServer: repository: 9r2s1098.c1.gra9.container-registry.ovh.net/opsmill/infrahub imagePullPolicy: Always ports: - - name: "infrahub-gui" - port: 8000 - targetPort: 8000 + - name: "infrahub-gui" + port: 8000 + targetPort: 8000 replicas: 1 type: ClusterIP waitForCache: @@ -123,30 +124,30 @@ messageQueue: repository: rabbitmq tag: 3.12-management ports: - - name: "4369" - port: 4369 - targetPort: 4369 - - name: "5671" - port: 5671 - targetPort: 5671 - - name: "5672" - port: 5672 - targetPort: 5672 - - name: "15671" - port: 15671 - targetPort: 15671 - - name: "15672" - port: 15672 - targetPort: 15672 - - name: "15691" - port: 15691 - targetPort: 15691 - - name: "15692" - port: 15692 - targetPort: 15692 - - name: "2567" - port: 2567 - targetPort: 2567 + - name: "4369" + port: 4369 + targetPort: 4369 + - name: "5671" + port: 5671 + targetPort: 5671 + - name: "5672" + port: 5672 + targetPort: 5672 + - name: "15671" + port: 15671 + targetPort: 15671 + - name: "15672" + port: 15672 + targetPort: 15672 + - name: "15691" + port: 15691 + targetPort: 15691 + - name: "15692" + port: 15692 + targetPort: 15692 + - name: "2567" + port: 2567 + targetPort: 2567 replicas: 1 type: ClusterIP pvc: From e76cdb251ef19be382c3793eea801c7542a4809c Mon Sep 17 00:00:00 2001 From: Patrick Ogenstad Date: Tue, 21 Nov 2023 16:49:56 +0100 Subject: [PATCH 119/446] Remove serve.log from excluded mypy checks --- backend/infrahub/serve/log.py | 16 +++++++++------- pyproject.toml | 3 --- 2 files changed, 9 insertions(+), 10 deletions(-) diff --git a/backend/infrahub/serve/log.py b/backend/infrahub/serve/log.py index 15f1959ff8..18c4f06a25 100644 --- a/backend/infrahub/serve/log.py +++ b/backend/infrahub/serve/log.py @@ -1,27 +1,29 @@ +from typing import Any + 
 from gunicorn.glogging import Logger
 
 from infrahub.log import get_logger
 
 
 class GunicornLogger(Logger):
-    def __init__(self, cfg):
+    def __init__(self, cfg: Any):
         super().__init__(cfg)
         self.logger = get_logger("gunicorn")
 
-    def critical(self, msg, *args, **kwargs):
+    def critical(self, msg: str, *args: Any, **kwargs: Any) -> None:
         self.logger.critical(msg, *args, **kwargs)
 
-    def error(self, msg, *args, **kwargs):
+    def error(self, msg: str, *args: Any, **kwargs: Any) -> None:
         self.logger.error(msg, *args, **kwargs)
 
-    def warning(self, msg, *args, **kwargs):
+    def warning(self, msg: str, *args: Any, **kwargs: Any) -> None:
         self.logger.warning(msg, *args, **kwargs)
 
-    def info(self, msg, *args, **kwargs):
+    def info(self, msg: str, *args: Any, **kwargs: Any) -> None:
         self.logger.info(msg, *args, **kwargs)
 
-    def debug(self, msg, *args, **kwargs):
+    def debug(self, msg: str, *args: Any, **kwargs: Any) -> None:
         self.logger.debug(msg, *args, **kwargs)
 
-    def exception(self, msg, *args, **kwargs):
+    def exception(self, msg: str, *args: Any, **kwargs: Any) -> None:
         self.logger.exception(msg, *args, **kwargs)
diff --git a/pyproject.toml b/pyproject.toml
index 1ed3c3b8df..44d0e527a7 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -205,9 +205,6 @@ ignore_errors = true
 module = "infrahub.message_bus.rpc"
 ignore_errors = true
 
-[[tool.mypy.overrides]]
-module = "infrahub.serve.log"
-disallow_untyped_defs = false
 
 [[tool.mypy.overrides]]
 module = "infrahub.server"

From 28a1080e805824fc51d885adcf369ce096b61a94 Mon Sep 17 00:00:00 2001
From: Bilal
Date: Tue, 21 Nov 2023 17:14:33 +0100
Subject: [PATCH 120/446] fix typo isOptionnal > isOptional

---
 frontend/src/components-form/checkbox.register.tsx      | 2 +-
 frontend/src/components-form/checkbox.tsx               | 6 +++---
 frontend/src/components-form/date-picker.tsx            | 6 +++---
 frontend/src/components-form/input.register.tsx         | 2 +-
 frontend/src/components-form/input.tsx                  | 6 +++---
 frontend/src/components-form/select-2-step.register.tsx | 2 +-
 frontend/src/components-form/select-2-step.tsx          | 6 +++---
 frontend/src/components-form/select.register.tsx        | 2 +-
 frontend/src/components-form/select.tsx                 | 8 ++++----
 frontend/src/components-form/switch.register.tsx        | 2 +-
 frontend/src/components-form/switch.tsx                 | 6 +++---
 frontend/src/components-form/textarea.register.tsx      | 2 +-
 frontend/src/components-form/textarea.tsx               | 6 +++---
 .../src/screens/edit-form-hook/dynamic-control-types.ts | 2 +-
 frontend/src/utils/formStructureForCreateEdit.ts        | 4 ++--
 frontend/tests/mocks/data/accountToken.ts               | 8 ++++----
 16 files changed, 35 insertions(+), 35 deletions(-)

diff --git a/frontend/src/components-form/checkbox.register.tsx b/frontend/src/components-form/checkbox.register.tsx
index 61b30117f3..79144ec419 100644
--- a/frontend/src/components-form/checkbox.register.tsx
+++ b/frontend/src/components-form/checkbox.register.tsx
@@ -10,7 +10,7 @@ interface Props {
   config?: RegisterOptions | undefined;
   setValue: UseFormSetValue;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
   error?: FormFieldError;
 }
diff --git a/frontend/src/components-form/checkbox.tsx b/frontend/src/components-form/checkbox.tsx
index 7dd6a349a7..0f41d13440 100644
--- a/frontend/src/components-form/checkbox.tsx
+++ b/frontend/src/components-form/checkbox.tsx
@@ -9,18 +9,18 @@ interface Props {
   onChange: (value: boolean) => void;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
 }
 
 export default function OpsCheckbox(props: Props) {
-  const { label, onChange, value, error, isProtected, isOptionnal } = props;
+  const { label, onChange, value, error, isProtected, isOptional } = props;
 
   const [enabled, setEnabled] = useState(value);
 
   return (
       {isProtected ?  : null}
diff --git a/frontend/src/components-form/date-picker.tsx b/frontend/src/components-form/date-picker.tsx
index 130f725b11..22694c2818 100644
--- a/frontend/src/components-form/date-picker.tsx
+++ b/frontend/src/components-form/date-picker.tsx
@@ -8,16 +8,16 @@ type OpsDatePickerProps = {
   onChange: (value?: Date) => void;
   className?: string;
   error?: FormFieldError;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
 };
 
 export const OpsDatePicker = (props: OpsDatePickerProps) => {
-  const { className, onChange, value, label, error, isOptionnal } = props;
+  const { className, onChange, value, label, error, isOptional } = props;
 
   return (
     <>
;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
 }
diff --git a/frontend/src/components-form/input.tsx b/frontend/src/components-form/input.tsx
index 101cc1a914..5a339e4f61 100644
--- a/frontend/src/components-form/input.tsx
+++ b/frontend/src/components-form/input.tsx
@@ -11,7 +11,7 @@ type OpsInputProps = {
   error?: FormFieldError;
   type: string;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   isUnique?: boolean;
   disabled?: boolean;
 };
@@ -19,14 +19,14 @@ type OpsInputProps = {
 const InputUniqueTips = () => must be unique;
 
 export const OpsInput = (props: OpsInputProps) => {
-  const { className, onChange, value, label, error, isProtected, isOptionnal, isUnique, disabled } =
+  const { className, onChange, value, label, error, isProtected, isOptional, isUnique, disabled } =
     props;
 
   return (
     <>
       {isProtected && }
       {isUnique && }
diff --git a/frontend/src/components-form/select-2-step.register.tsx b/frontend/src/components-form/select-2-step.register.tsx
index b92f27bf3d..5069f334b7 100644
--- a/frontend/src/components-form/select-2-step.register.tsx
+++ b/frontend/src/components-form/select-2-step.register.tsx
@@ -13,7 +13,7 @@ interface Props {
   setValue: UseFormSetValue;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
 }
 
diff --git a/frontend/src/components-form/select-2-step.tsx b/frontend/src/components-form/select-2-step.tsx
index 743ca3337d..f9e0287924 100644
--- a/frontend/src/components-form/select-2-step.tsx
+++ b/frontend/src/components-form/select-2-step.tsx
@@ -22,11 +22,11 @@ interface Props {
   onChange: (value: iTwoStepDropdownData) => void;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
 }
 
 export const OpsSelect2Step = (props: Props) => {
-  const { label, options, value, error, onChange, isProtected, isOptionnal } = props;
+  const { label, options, value, error, onChange, isProtected, isOptional } = props;
 
   const { objectid } = useParams();
 
   const branch = useReactiveVar(branchVar);
@@ -104,7 +104,7 @@ export const OpsSelect2Step = (props: Props) => {
diff --git a/frontend/src/components-form/select.register.tsx b/frontend/src/components-form/select.register.tsx
index 27969ffc1a..765832e8f9 100644
--- a/frontend/src/components-form/select.register.tsx
+++ b/frontend/src/components-form/select.register.tsx
@@ -13,7 +13,7 @@ type SelectRegisterProps = {
   setValue: UseFormSetValue;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
 };
 
diff --git a/frontend/src/components-form/select.tsx b/frontend/src/components-form/select.tsx
index be8be811e9..3546c6083c 100644
--- a/frontend/src/components-form/select.tsx
+++ b/frontend/src/components-form/select.tsx
@@ -10,18 +10,18 @@ type SelectProps = {
   onChange: (value: SelectOption) => void;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
 };
 
 export const OpsSelect = (props: SelectProps) => {
-  const { label, isProtected, isOptionnal, ...propsToPass } = props;
+  const { label, isProtected, isOptional, ...propsToPass } = props;
 
   const getLabel = () => {
-    if (label && isOptionnal) {
+    if (label && isOptional) {
       return label;
     }
 
-    if (label && !isOptionnal) {
+    if (label && !isOptional) {
       return `${label} *`;
     }
 
diff --git a/frontend/src/components-form/switch.register.tsx b/frontend/src/components-form/switch.register.tsx
index b997e71e03..f5233aa85d 100644
--- a/frontend/src/components-form/switch.register.tsx
+++ b/frontend/src/components-form/switch.register.tsx
@@ -12,7 +12,7 @@ interface Props {
   onChange?: Function;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
 }
 
diff --git a/frontend/src/components-form/switch.tsx b/frontend/src/components-form/switch.tsx
index 6e859c3e65..15f1d8d48b 100644
--- a/frontend/src/components-form/switch.tsx
+++ b/frontend/src/components-form/switch.tsx
@@ -9,18 +9,18 @@ interface Props {
   onChange: (value: boolean) => void;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
 }
 
 export default function OpsSwitch(props: Props) {
-  const { label, onChange, value, error, isProtected, isOptionnal } = props;
+  const { label, onChange, value, error, isProtected, isOptional } = props;
 
   const [enabled, setEnabled] = useState(value);
 
   return (
       {isProtected ?  : null}
diff --git a/frontend/src/components-form/textarea.register.tsx b/frontend/src/components-form/textarea.register.tsx
index 2623edd1cf..d8dc381477 100644
--- a/frontend/src/components-form/textarea.register.tsx
+++ b/frontend/src/components-form/textarea.register.tsx
@@ -11,7 +11,7 @@ interface Props {
   setValue: UseFormSetValue;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
 }
 
diff --git a/frontend/src/components-form/textarea.tsx b/frontend/src/components-form/textarea.tsx
index e0d83b7da1..f9f8e706e0 100644
--- a/frontend/src/components-form/textarea.tsx
+++ b/frontend/src/components-form/textarea.tsx
@@ -10,18 +10,18 @@ type OpsInputProps = {
   className?: string;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   disabled?: boolean;
 };
 
 export const OpsTextarea = (props: OpsInputProps) => {
-  const { className, onChange, value, label, error, isProtected, isOptionnal, disabled } = props;
+  const { className, onChange, value, label, error, isProtected, isOptional, disabled } = props;
 
   return (
     <>
       {isProtected ?  : null}
diff --git a/frontend/src/screens/edit-form-hook/dynamic-control-types.ts b/frontend/src/screens/edit-form-hook/dynamic-control-types.ts
index b613d37702..590232596d 100644
--- a/frontend/src/screens/edit-form-hook/dynamic-control-types.ts
+++ b/frontend/src/screens/edit-form-hook/dynamic-control-types.ts
@@ -93,7 +93,7 @@ export interface DynamicFieldData {
   config?: RegisterOptions;
   error?: FormFieldError;
   isProtected?: boolean;
-  isOptionnal?: boolean;
+  isOptional?: boolean;
   isReadOnly?: boolean;
   isUnique?: boolean;
   disabled?: boolean;
diff --git a/frontend/src/utils/formStructureForCreateEdit.ts b/frontend/src/utils/formStructureForCreateEdit.ts
index 07bdf7bd59..e3cdc0d12b 100644
--- a/frontend/src/utils/formStructureForCreateEdit.ts
+++ b/frontend/src/utils/formStructureForCreateEdit.ts
@@ -121,7 +121,7 @@ const getFormStructureForCreateEdit = (
       config: {
         validate: (value: any) => validate(value, attribute, attribute.optional),
       },
-      isOptionnal: attribute.optional,
+      isOptional: attribute.optional,
       isReadOnly: attribute.read_only,
       isProtected: getIsDisabled({
         owner: row && row[attribute.name]?.owner,
@@ -206,7 +206,7 @@ const getFormStructureForCreateEdit = (
       config: {
         validate: (value: any) => validate(value, undefined, relationship.optional),
       },
-      isOptionnal: relationship.optional,
+      isOptional: relationship.optional,
       isProtected: getIsDisabled({
         owner: row && row[relationship.name]?.properties?.owner,
         user,
diff --git a/frontend/tests/mocks/data/accountToken.ts b/frontend/tests/mocks/data/accountToken.ts
index 0012cdc8dc..2166bb831a 100644
--- a/frontend/tests/mocks/data/accountToken.ts
+++ b/frontend/tests/mocks/data/accountToken.ts
@@ -801,7 +801,7 @@ export const accountTokenFormStructure = [
     value: null,
     options: { values: [] },
     config: {},
-    isOptionnal: true,
+    isOptional: true,
     isProtected: false,
     isUnique: false,
   },
@@ -813,7 +813,7 @@ export const accountTokenFormStructure = [
     value: "06438eb2-8019-4776-878c-0941b1f1d1ec",
     options: { values: [] },
     config: {},
-    isOptionnal: false,
+    isOptional: false,
     isProtected: false,
     isUnique: true,
   },
@@ -825,7 +825,7 @@ export const accountTokenFormStructure = [
     value: "2023-07-14T22:00:00.000Z",
     options: { values: [] },
     config: {},
-    isOptionnal: true,
+    isOptional: true,
     isProtected: false,
     isUnique: false,
   },
@@ -837,7 +837,7 @@ export const accountTokenFormStructure = [
     value: "",
     options: { values: [] },
     config: {},
-    isOptionnal: true,
+    isOptional: true,
     isProtected: false,
   },
 ];

From cad9aadcfa5451f85af25a267be08b8f6e1d891a Mon Sep 17 00:00:00 2001
From: Mark Michon
Date: Tue, 21 Nov 2023 10:35:03 -0800
Subject: [PATCH 121/446] docs: docs cleanup for consistency (#1479)

---
 docs/guides/installation.md                   | 64 ++++++------
 docs/infrahubctl/readme.md                    | 30 +++---
 docs/python-sdk/readme.md                     | 31 +++---
 docs/reference/configuration.md               |  7 +-
 docs/reference/schema/readme.md               | 84 +++++++-------
 docs/release-notes/readme.md                  |  0
 docs/topics/architecture.md                   | 51 +++++-----
 docs/topics/artifact.md                       | 56 +++++------
 docs/topics/auth.md                           | 21 ++--
 docs/topics/graphql.md                        | 74 +++++++-------
 docs/topics/local-demo-environment.md         | 90 +++++++++--------
 docs/topics/object-storage.md                 | 10 +-
 docs/topics/proposed-change.md                | 21 ++--
 docs/topics/readme.md                         |  2 +-
 docs/topics/transformation.md                 | 98 +++++++++++--------
 docs/tutorials/getting-started/branches.md    | 61 ++++++------
 .../getting-started/creating-an-object.md     |  4 +-
 .../getting-started/custom-api-endpoint.md    | 18 ++--
 .../getting-started/git-integration.md        | 32 +++---
 .../getting-started/graphql-mutation.md       | 10 +-
 .../getting-started/graphql-query.md          | 21 ++--
 .../getting-started/historical-data.md        |  2 +-
 .../introduction-to-infrahub.md               | 23 +++--
 .../getting-started/jinja2-integration.md     | 49 +++++----
 .../getting-started/lineage-information.md    | 26 ++---
 docs/tutorials/getting-started/readme.md      | 33 ++++---
 docs/tutorials/getting-started/schema.md      | 29 +++---
 27 files changed, 518 insertions(+), 429 deletions(-)
 delete mode 100644 docs/release-notes/readme.md

diff --git a/docs/guides/installation.md b/docs/guides/installation.md
index 2cbc2c057c..c0f426df9e 100644
--- a/docs/guides/installation.md
+++ b/docs/guides/installation.md
@@ -3,78 +3,78 @@ icon: terminal
 ---
 # Installing Infrahub
 
-Infrahub is composed of multiple components, the backend is mostly written in Python and the frontend in React.
+Infrahub is composed of multiple components. The backend is mostly written in Python and the frontend in React.
 
 The main components are:
-- A **Frontend** written in react
-- An **API Server** written in Python with FastAPI
-- A **Git agent** to manage the interaction with external Git repositories
-- A **Graph Database** based on `Neo4j` 5.x or `memgraph`
-- A **Message Bus** based on `RabbitMQ`
+
+- A **Frontend** written in react.
+- An **API server** written in Python with FastAPI.
+- A **Git agent** to manage the interaction with external Git repositories.
+- A **Graph database** based on `Neo4j` 5.x or `memgraph`.
+- A **Message bus** based on `RabbitMQ`.
 
 ## Docker Compose
 
-Currently, the recommended way to run Infrahub is to use the docker-compose project included with the project combined with the helper commands defined in `invoke`
+The recommended way to run Infrahub is to use the Docker Compose project included with the project combined with the helper commands defined in `invoke`.
 
 The pre-requisites for this type of deployment are to have:
-- [Invoke](https://www.pyinvoke.org) (version 2 minimum) and Toml
-- [Docker](https://docs.docker.com/engine/install/) (version 24.x minimum)
-
+- [Invoke](https://www.pyinvoke.org) (version 2 minimum) and TOML
+- [Docker](https://docs.docker.com/engine/install/) (version 24.x minimum)
 
-+++ Mac OS
++++ MacOS
 
 ### Invoke
 
-On Mac OS Python is installed by default so you should be able to install `invoke` directly.
-Invoke works best when you install it in the main Python but you can also install it in a virtual environment if you prefer.
+On MacOS, Python is installed by default so you should be able to install `invoke` directly.
+Invoke works best when you install it in the main Python environment, but you can also install it in a virtual environment if you prefer.
 
 To install `invoke` and `toml`, run the following command:
 
-```
+```sh
 pip install invoke toml
 ```
 
 ### Docker
 
-For Docker, you can download Docker Desktop directly from Docker's website with the instructions https://docs.docker.com/desktop/install/mac-install/
+To install Docker, follow the [official instructions on the Docker website](https://docs.docker.com/desktop/install/mac-install/) for your platform.
 
 +++ Windows
 
-The current recommendation for Windows is to install a Linux VM via WSL2 and follow the installation guide for Ubuntu.
+On Windows, install a Linux VM via WSL2 and follow the installation guide for Ubuntu.
 
 !!!
-The native support on Windows is currently under investigation and is being tracked in the [issue 794](https://github.com/opsmill/infrahub/issues/794).
+The native support on Windows is currently under investigation and is being tracked in [issue 794](https://github.com/opsmill/infrahub/issues/794).
 Please add a comment to the issue if this is something that would be useful to you.
 !!!
 
 +++ Ubuntu
 
 !!!warning
-On Ubuntu, depending on which distribution you're running there is a good chance your version of Docker might be outdated.
+On Ubuntu, depending on which distribution you're running, there is a good chance your version of Docker might be outdated.
 Please ensure your installation meets the version requirements mentioned below.
 !!!
 
 ### Invoke
 
-Invoke is a Python package that is usually installed with `pip install invoke toml`.
-If Python is not already installed on your system you'll need to install it first with `sudo apt install python3-pip`
+Invoke is a Python package commonly installed by running `pip install invoke toml`.
+If Python is not already installed on your system, install it first with `sudo apt install python3-pip`.
 
 ### Docker
 
+Check if Docker is installed and which version is installed with `docker --version`
+The version should be at least `24.x`. If the version is `20.x`, it's recommended to upgrade.
 
-You can check if docker is installed and which version of docker is installed with `docker --version`
-The version should be at least `24.x`. if the version is `20.x` it's recommended to upgrade.
-
-[This tutorial (for Ubuntu 22.04) explains how to install the latest version of docker on Ubuntu](https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-22-04)
+[This tutorial (for Ubuntu 22.04) explains how to install the latest version of docker on Ubuntu](https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-22-04).
 
 +++ Other
 
-The deployment should work on any systems that can run a modern version of Docker and Python.
+The deployment should work on any systems that can run a modern version of Docker and Python.
 Please reach out if you need some help and feel free to send a PR with the installation instructions for your platform.
 
 +++
 
-Once docker desktop and invoke are properly installed you can build start Infrahub with the following command
-```
+Once docker desktop and invoke are installed you can build, start, and initialize the Infrahub demo environment with the following command:
+
+```sh
 invoke demo.build demo.start demo.load-infra-schema demo.load-infra-data
 ```
 
@@ -82,10 +82,10 @@ invoke demo.build demo.start demo.load-infra-schema demo.load-infra-data
 
 ## GitPod
 
-The project is also pre-configured to run in GitPod
+The project is also pre-configured to run in GitPod.
 
 !!!
-GitPod provides a Cloud Development Environment that makes it very easy to run any project right within your browser.
+GitPod provides a Cloud Development Environment that allows you to run any project right within your browser.
 !!!
 
 GitPod has a generous free tier of 50/hours per month for free.
@@ -95,6 +95,6 @@ GitPod has a generous free tier of 50/hours per month for free.
 
 ## K8s with Helm Chart
 
-The support for K8s is not yet available but we are actively tracking this effort in our short/mid-term roadmap
-https://github.com/opsmill/infrahub/issues/506
-Please reach out and let us know you are interested in this feature, it's always helpful to prioritize what the team needs to focus on.
+Support for K8s is not yet available, but we are actively tracking this effort in our short/mid-term roadmap. You can follow [this issue for updates](https://github.com/opsmill/infrahub/issues/506).
+
+Please reach out and let us know if you are interested in this feature. It helps us prioritize what the team needs to focus on.
diff --git a/docs/infrahubctl/readme.md b/docs/infrahubctl/readme.md
index 8d321a2a33..fcd65ca486 100644
--- a/docs/infrahubctl/readme.md
+++ b/docs/infrahubctl/readme.md
@@ -3,27 +3,27 @@
 
 `infrahubctl` is a command line utility designed to help with the day to day management of an Infrahub installation. It's meant to run on any laptop or server and it communicates with a remote Infrahub server over the network.
 
-`infrahubctl` can help you to
-- Manage the branches in Infrahub : List, Create, Merge, Rebase, Delete
-- Manage the schema and load new schema files into Infrahub
-- Execute any Python script that requires access to the Python SDK
-- Render a Jinja Template locally for troubleshooting
-- Execute a GraphQL query store in a Git repository for troubleshooting
-- Validate that input files conform with the format expected by Infrahub
+`infrahubctl` can help you to:
+- Manage the branches in Infrahub: List, Create, Merge, Rebase, Delete.
+- Manage the schema and load new schema files into Infrahub.
+- Execute any Python script that requires access to the Python SDK.
+- Render a Jinja Template locally for troubleshooting.
+- Execute a GraphQL query store in a Git repository for troubleshooting.
+- Validate that input files conform with the format expected by Infrahub.
 
 ## Configuration
 
-`infrahubctl` requires a minimal set of configuration in order to connect to the right Infrahub server with the correct credentials. These settings can be provided either in a configuration file `infrahubctl.toml` or via environment variables.
+`infrahubctl` requires a minimum configuration in order to connect to the right Infrahub server with the correct credentials. These settings can be provided either in a configuration file, `infrahubctl.toml`, or via environment variables.
 
-### Environment Variables
+### Environment variables
 
-| Name | Example value |
-| -- | -- |
-| `INFRAHUB_ADDRESS` | http://localhost:8000 |
-| `INFRAHUB_API_TOKEN` | `06438eb2-8019-4776-878c-0941b1f1d1ec` |
-| `INFRAHUB_DEFAULT_BRANCH` | main |
+| Name                      | Example value                          |
+| ------------------------- | -------------------------------------- |
+| `INFRAHUB_ADDRESS`        | http://localhost:8000                  |
+| `INFRAHUB_API_TOKEN`      | `06438eb2-8019-4776-878c-0941b1f1d1ec` |
+| `INFRAHUB_DEFAULT_BRANCH` | main                                   |
 
-> the location of a configuration file can be also provided via environment variable : `INFRAHUBCTL_CONFIG`
+> You can also provide the location of a configuration file via the environment variable `INFRAHUBCTL_CONFIG`.
 
 ### `infrahubctl.toml` file
diff --git a/docs/python-sdk/readme.md b/docs/python-sdk/readme.md
index e2c269ecf4..2968f87067 100644
--- a/docs/python-sdk/readme.md
+++ b/docs/python-sdk/readme.md
@@ -4,58 +4,63 @@ A Python SDK for Infrahub greatly simplifies how we can interact with Infrahub p
 
 ## Installation
 
-> The Python SDK is currently hosted in the same repository as Infrahub, but once both reaches a better maturity state, the plan is to make it easy to install the SDK as a stand alone package.
+> The Python SDK is currently hosted in the same repository as Infrahub, but once both reach a better maturity state, the plan is to make it possible to install the SDK as a stand alone package.
 
-For now, the recommendation is to clone the main Infrahub repository on your file system and to install the entire infrahub package in your own repository using a relative path with the `--editable` flag.
+For now, the recommendation is to clone the main Infrahub repository on your file system and to install the entire Infrahub package in your own repository using a relative path with the `--editable` flag.
 
 ```
-poetry add --editable
+poetry add --editable
 ```
 
-## Getting Started
+## Getting started
 
-The SDK supports both synchronous and asynchronous Python. The default asynchronous version is provided by the `InfrahubClient` class while the synchronous version is using the `InfrahubClientSync` class.
+The SDK supports both synchronous and asynchronous Python. The default asynchronous version is provided by the `InfrahubClient` class while the synchronous version uses the `InfrahubClientSync` class.
 
-### Dynamic Schema Discovery
+### Dynamic schema discovery
 
-By default, the Python client will automatically gather the active schema from Infrahub and all methods will be autogenerated based on that.
+By default, the Python client will automatically gather the active schema from Infrahub and all methods will generate based on that.
 
 +++ Async
+
 ```python
 from infrahub_sdk import InfrahubClient
 
 client = await InfrahubClient.init(address="http://localhost:8000")
 ```
+
 +++ Sync
+
 ```python
 from infrahub_sdk import InfrahubClientSync
 
 client = InfrahubClientSync.init(address="http://localhost:8000")
 ```
-+++
++++
 
 ### Authentication
 
-The SDK is using a Token based authentication method to authenticate with the API and GraphQL
+The SDK is using a token-based authentication method to authenticate with the API and GraphQL.
-The token can either be provided with `config=Config(api_token="TOKEN")` at initialization time or it can be automatically retrieved
-from the environment variable `INFRAHUB_SDK_API_TOKEN`
+The token can either be provided with `config=Config(api_token="TOKEN")` at initialization time or it can be retrieved automatically from the environment variable `INFRAHUB_SDK_API_TOKEN`.
 
-> In the demo environment the default token for the Admin account is `06438eb2-8019-4776-878c-0941b1f1d1ec`
+> In the demo environment, the default token for the Admin account is `06438eb2-8019-4776-878c-0941b1f1d1ec`.
 
 +++ Async
+
 ```python
 from infrahub_sdk import InfrahubClient, Config
 
 client = await InfrahubClient.init(config=Config(api_token="TOKEN"))
 ```
+
 +++ Sync
+
 ```python
 from infrahub_sdk import InfrahubClientSync, Config
 
 client = InfrahubClientSync.init(config=Config(api_token="TOKEN"))
 ```
-+++
++++
diff --git a/docs/reference/configuration.md b/docs/reference/configuration.md
index bbce93e232..cbc2f51b2b 100644
--- a/docs/reference/configuration.md
+++ b/docs/reference/configuration.md
@@ -5,14 +5,19 @@ icon: tools
 order: 900
 ---
 # Configuration File
+
 !!!warning Under Construction
+
 This page is still under construction and is not available yet.
 Please reach out in Slack if you have some questions about the **Configuration File**
+
 !!!
 
 Until a better documentation is available, the best reference to understand what options are available in the configuration file is the code itself.
 The configuration file format is defined in Pydantic models in `infrahub/config.py`
 
 ==- Explore the Source Code for the configuration File
+
 :::code source="../../backend/infrahub/config.py"
 :::
-==-
\ No newline at end of file
+
+==-
diff --git a/docs/reference/schema/readme.md b/docs/reference/schema/readme.md
index 61c59c21cf..e785d44732 100644
--- a/docs/reference/schema/readme.md
+++ b/docs/reference/schema/readme.md
@@ -1,24 +1,25 @@
 # Schema
 
-In Infrahub, the schema is at the center of most things and our goal is to provide as much flexibility as possible to the users to extend and customize the schema.
+In Infrahub, the schema is at the center of most things and our goal is to provide as much flexibility as possible to allow users to extend and customize the schema.
 
-Out of the box, Infrahub doesn't have a schema for most things and it's up to the users to load a schema that fits their needs. Over time we are planning to maintain different schemas for the common type of use cases, but for now, we are providing one example schema to model a simple network with basic objects like Device, Interface, IPAddress etc
+Out of the box, Infrahub doesn't have a schema for most things and it's up to users to load a schema that fits their needs. Over time we plan to maintain different schemas for the common types of use cases, but for now, we are providing one example schema to model a basic network with objects like Device, Interface, IPAddress, etc.
 
-Unlike traditional databases that can only have one schema at a time, in Infrahub, it is possible to have a different schema per branch. This is possible because the schema itself is stored in the database like any other object.
+Unlike traditional databases that can only have one schema at a time, in Infrahub it is possible to have a different schema per branch. This is possible because the schema itself is stored in the database like any other object.
 
 New schema can be uploaded via the `infrahubctl schema load` command or via the REST API directly.
 
 !!!info
-In the Tech Preview not all features of the schema are available yet, there are still some important changes coming like the support for schema migration and schema dependencies.
+In the Tech Preview not all features of the schema are available yet. There are still some important changes coming like support for schema migration and schema dependencies.
 !!!
 
 ## Namespace, Node, Attributes, Relationships & Generics
 
-The schema is composed of 4 primary types of object: [!badge Nodes] that are themselves composed of [!badge Attributes] and [!badge Relationships] and finally [!badge Generics]
-- A [!badge Node] in Infrahub represents a `Model`.
-- An [!badge Attribute] represents a direct value associated with a [!badge Node] like a `Text`, a `Number` etc ...
-- A [!badge Relationship] represents a unidirectional link between 2 [!badge Node], a [!badge Relationship] can be of cardinality `one` or `many`.
-- A [!badge Generics] can be used to share some attributes between multiple [!badge Node], if you're familiar with programming concept, it's close to class inheritance.
+The schema is composed of 4 primary types of objects: `Nodes`- that are themselves composed of `Attributes` and `Relationships` and finally `Generics`.
+
+- A `Node` in Infrahub represents a `Model`.
+- An `Attribute` represents a direct value associated with a `Node` like a `Text`, a `Number` etc ...
+- A `Relationship` represents a unidirectional link between 2 `Node`, a `Relationship` can be of cardinality `one` or `many`.
+- A `Generic` can be used to share some attributes between multiple `Node`, if you're familiar with programming concept, it's close to class inheritance.
 
 In the example below, the node `Person` has 2 attributes (`name` and `description`) and the node `Car` has 1 attribute (`model`) and 1 relationship to `Person`, identified by `owner``.
@@ -46,7 +47,7 @@ nodes:
       kind: Attribute
 ```
 
-[!badge Node], [!badge Attribute] and [!badge Relationship] are defined by their `kind`. While the name and the namespace of the node are up to the creator of the schema, the kinds for the attributes and the relationships are coming from Infrahub. The `kind` of an attribute, or a relationship, is very important because it defined how each element will be represented in GraphQL and the UI.
+`Node`, `Attribute`, and `Relationship` are defined by their `kind`. While the name and the namespace of the node are up to the creator of the schema, the kinds for the attributes and the relationships are coming from Infrahub. The `kind` of an attribute, or a relationship, is important because it defines how each element is represented in GraphQL and the UI.
 
 > The `kind` of a model is generated by concatenating the `namespace` and the `name`.
 
@@ -57,14 +58,14 @@ nodes:
 
 - `Boolean`: Flag that can be either True or False
 - `DateTime`: A Data and a Time
 - `Email`: Email address
-- `Password`: A Text String that should be offuscated.
+- `Password`: A Text String that should be obfuscated
- `URL`: An URL to a website or a resource over HTTP
- `File`: Path to a file on the filesystem
- `MacAddress`: Mac Address following the format (XX:XX:XX:XX:XX:XX)
-- `Color`: A html color
+- `Color`: An HTML color
- `Bandwidth`: Bandwidth in kbps
-- `IPHost`: Ip Address in either IPV4 or IPv6 format
-- `IPNetwork`: Ip Network in either IPV4 or IPv6 format
+- `IPHost`: IP address in either IPv4 or IPv6 format
+- `IPNetwork`: IP network in either IPv4 or IPv6 format
- `Checkbox`: Duplicate of `Boolean`
- `List`: List of any value
- `JSON`: Any data structure compatible with JSON
@@ -74,12 +75,13 @@ nodes:
- `Generic`: Default relationship without specific significance
- `Attribute`: Relationship of type Attribute are represented in the detailed view and the list view
-- `Component`: Indicate a relationship with another node that is a component of the current node, Example: Interface is a component to a Device
-- `Parent`: Indicate a relationship with another node that is a parent to the current node, Example: Device is a parent to an Interface
-- `Group`: Indicate a relationship to a member or a subscriber of a group.
+- `Component`: Indicates a relationship with another node that is a component of the current node. Example: Interface is a component to a Device
+- `Parent`: Indicates a relationship with another node that is a parent to the current node.
Example: Device is a parent to an Interface
+- `Group`: Indicates a relationship to a member or a subscriber of a group

==- Attribute Kinds Behavior in the UI
-| Kind | Display in List View | Display in Detailed View | { class="compact" } |
+{ class="compact" }
+| Kind | Display in List View | Display in Detailed View |
| ------------ | --------------------- | ------------------------- |
| `ID` | No | Yes |
| `Text` | Yes | Yes |
@@ -100,8 +102,8 @@ nodes:
| `Any` | No | Yes |

==- Relationship Kinds Behavior in the UI
-
-| ID | cardinality | Display in List View | Display in Detailed View | Display in Tab | { class="compact" } |
+{ class="compact" }
+| ID | cardinality | Display in List View | Display in Detailed View | Display in Tab |
| ----------- | ----------- | --------------------- | ------------------------- | -------------- |
| `Generic` | `one` | No | Yes | No |
| `Generic` | `many` | No | No | Yes |
@@ -116,13 +118,14 @@ nodes:

## Generics

-A Generic can be used to:
+A generic can be used to:
+
- Share multiple attributes or relationships between different types of nodes.
- Connect multiple types of Node to the same relationship.
- Define Attribute and Relationship on a specific list of nodes and avoid creating attributes for everything

-In the example below, we took the schema that we used previously and we refactored it using Generic
-Now `Car` is a Generic with 2 attributes and 1 relationship and 2 models `ElectricCar` and `GazCar` are referencing it.
+In the example below, we took the schema that we used previously and refactored it using a generic.
+Now `Car` is a generic with 2 attributes and 1 relationship, and 2 models, `ElectricCar` and `GazCar`, reference it.
In the GraphQL schema, `ElectricCar` and `GazCar` will have all the attributes and the relationships of `Car` in addition to the one defined under their respective section.
```yaml
@@ -173,14 +176,15 @@ nodes:

## Branch Support

-By default, all models defined in the schema will be **branch aware** which means that any changes to an object based on a **branch aware** model will be local to the branch and will not affect the other branches.
+By default, all models defined in the schema will be **branch-aware**, which means that any changes to an object based on a **branch-aware** model will be local to the branch and will not affect the other branches.
+
+A model can also be configured as:

-A model can also be configured as :
- **branch agnostic**: All changes to an object based on a **branch agnostic** model will automatically be available in all branches.
- **branch local**: All changes will stay local to the branch. A model in **branch local** mode will not be affected by the Diff and the Merge.

### Summary
-
+{ class="compact" }
| Branch Support | Description | Diff | Merge | Rebase |
| -------------- | ------------------------------------------------------------------------------------ | ---- | ----- | ------ |
| **Aware** | All changes will be local to the branch and can be merged back into the main branch. | Yes | Yes | Yes |
@@ -189,9 +193,10 @@ A model can also be configured as :

### Branch Agnostic

-In the frontend, the API or the GraphQL endpoint, **branch agnostic** objects can be modified on any branch, no restrictions apply.
-To configure a model as **branch agnostic** you need to set the option `branch` to `agnostic` in the schema
+In the frontend, the API, or the GraphQL endpoint, **branch-agnostic** objects can be modified on any branch; no restrictions apply.
+
+To configure a model as **branch-agnostic**, you need to set the option `branch` to `agnostic` in the schema.

```yaml
nodes:
@@ -203,9 +208,9 @@ nodes:
name: name
```

-### Attribute and Relationship
+### Attributes and relationships

-Attributes and Relationships can be configured as **branch aware**, **branch agnostic** or **branch local** too, independently of the configuration of the model itself using the parameter: `branch`
+Attributes and relationships can be configured as **branch-aware**, **branch-agnostic**, or **branch-local** too, independently of the configuration of the model itself, using the parameter `branch`.

```yaml
nodes:
@@ -219,18 +224,21 @@ nodes:
```

By default, if a specific value is not defined:
+
- **attributes** will inherit the configuration of their parent model.
- **relationships** will become:
-  - **branch agnostic** only if both models, on each end of the relationship, are **branch agnostic**. If either model is **branch aware** the relationship will be set as **branch aware**.
-  - **branch local** if either model, on each end of the relationship, is **branch local**.
+  - **branch-agnostic** only if both models on each end of the relationship are **branch-agnostic**. If either model is **branch-aware**, the relationship will be set as **branch-aware**.
+  - **branch-local** if either model, on each end of the relationship, is **branch-local**.

## Schema File

-The recommended way to manage and load a schema is to create a schema file in Yaml format, with a schema file it's possible to
+The recommended way to manage and load a schema is to create a schema file in YAML format. With a schema file it's possible to:
+
- Define new nodes
- Extend nodes, by adding attributes or relationships to the existing nodes

-At a high level, the format of the schema file looks like that.
+At a high level, the format of the schema file looks like the following:
+
```yaml
---
version: '1.0'
@@ -251,14 +259,16 @@ Schema files can be loaded into Infrahub with the `infrahubctl` command or direc

#### infrahubctl command

-The `infrahubctl` command can be used to load indivual schema file or multiple files as part of a directory.
-```
+The `infrahubctl` command can be used to load individual schema files or multiple files as part of a directory.
+
+```sh
infrahubctl schema load
```

#### Git integration

-The schemas that should be loaded must be declared in the ``.infrahub.yml`` directory, under schemas.
+The schemas that should be loaded must be declared in the `.infrahub.yml` file, under the `schemas` key.
+
> Individual files and directory are both supported.

```yaml
diff --git a/docs/release-notes/readme.md b/docs/release-notes/readme.md
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/docs/topics/architecture.md b/docs/topics/architecture.md
index bf566dae03..ffedbde4fe 100644
--- a/docs/topics/architecture.md
+++ b/docs/topics/architecture.md
@@ -1,60 +1,65 @@
---
label: Architecture
layout: default
-order: 1000
---

-# Architecture Diagram
+# Architecture diagram

![](../media/high_level_architecture.excalidraw.svg)

-## Infrahub Components
+## Infrahub components

-### API Server
+### API server

Language: Python

-The API Server is serving the REST API and the GraphQL endpoints.
-Internally the API Server is built with FastAPI as the web framework and Graphene to generate the GraphQL endpoints.
+The API server delivers the REST API and the GraphQL endpoints.
+Internally, the API server is built with FastAPI as the web framework and Graphene to generate the GraphQL endpoints.

!!!
+
Multiple instance of the API Server can run at the same time to process more requests.
+
!!!
-### Git Agent +### Git agent Language: Python -The Git agent is responsible for managing all the content related to the Git repositories, it organizes the file systems in order to quickkly access any relevant commit. The Git Agent is periodically pulling the Git Server for updates and it's listening to the RPC channel on the event bus for tasks to execute. +The Git agent is responsible for managing all the content related to the Git repositories. It organizes the file systems in order to quickly access any relevant commit. The Git Agent periodically pulls the Git server for updates and listens to the RPC channel on the event bus for tasks to execute. + Some of the tasks that can be executed on the Git agent includes: -- Rendering a Jinja template -- Rendering a transform function -- Executing a check -- All Git operations (pull/merge/diff) + +- Rendering a Jinja template. +- Rendering a transform function. +- Executing a check. +- All Git operations (pull/merge/diff). !!! -Multiple instance of the Git Agent can run at the same time to process more requests. + +Multiple instance of the Git agent can run at the same time to process more requests. + !!! ### Frontend Language: React -## External Systems +## External systems -### Graph Database +### Graph database -The Graph Database is based on Bolt and Cyper. Currently we have validated both Neo4j 5.x and Memgraph as possible options. -Neo4j is a production grade, battle tested graph database that is used in 1000s of deployments around the world. -Memgraph is a lightweight, very fast, in-memory database that works great for testing and demo. +The Graph database is based on Bolt and Cyper. Currently, we have validated both Neo4j 5.x and Memgraph as possible options. +Neo4j is a production grade, battle tested graph database that is used in thousands of deployments around the world. +Memgraph is a lightweight, very fast, in-memory database that works great for testing and demos. 
-### Message Bus
+### Message bus

-The message bus is based on RabbitMQ, it supports both a fanout channel to distribute messages to all members at the same time and a RPC framework to distribute work Syncronously.
+The message bus is based on RabbitMQ. It supports both a fanout channel to distribute messages to all members at the same time and an RPC framework to distribute work synchronously.

### Cache

-The cache is based on Redis, it's mainly used as a central point to support the distributed lock systems between all the different component of the system
+The cache is based on Redis. It's mainly used as a central point to support the distributed lock system between all the different components of the system.

-### Git Server (Github/Gitlab)
+### Git server (GitHub/GitLab)

-Any Git server. The most popular being : GitHub, GitLab or Bitbucket
\ No newline at end of file
+Any Git server. The most popular are GitHub, GitLab, and Bitbucket.
diff --git a/docs/topics/artifact.md b/docs/topics/artifact.md
index 7b07411e00..c711d9bfa0 100644
--- a/docs/topics/artifact.md
+++ b/docs/topics/artifact.md
@@ -1,43 +1,46 @@
---
label: Artifact
layout: default
-order: 800
---

# Artifact

-An artifact is the result of a [Transformation](./transformation.md) for a specific context and/or object, it can have different format either in plain text or JSON.
+An artifact is the result of a [Transformation](./transformation.md) for a specific context and/or object. It can be in either plain text or JSON format.

!!!success Examples
-- For a network device, an artifact can be used to track the configuration generated from a Jinja template (RFile)
-- For a Security Device, an artifact can be the list of rules in JSON in the format of your choice generated by a Python Transformation
-An artifact can also represent the configuration of a DNS server or the configuration of a specific Virtual IP on a load balancer.
+
+- For a network device, you can use an artifact to track the configuration generated from a Jinja template (RFile).
+- For a security device, an artifact can be the list of rules in JSON, in the format of your choice, generated by a Python Transformation.
+- An artifact can also represent the configuration of a DNS server or the configuration of a specific Virtual IP on a load balancer.
+
!!!

While it's always possible to generate [Transformations](./transformation.md) on demand via the API, having an Artifact provide some additional benefits:

-- **Caching** : Generated Artifact are stored in the internal [object storage](./object-storage.md). For resource intensive Transformation, it will significantly reduce the load of the system if an artifact can be serve from the cache instead of regenerating each time.
-- **Traceability** : Past values of an artifact remains available. In a future release, it will be possible to compare the value of an artifact over time.
-- **Peer Review** : Artifact are automatically part of the Proposed Change review process
-While the content of an artifact can change, it's identifier will remain the same over time.
+- **Caching**: Generated artifacts are stored in the internal [object storage](./object-storage.md). For resource-intensive transformations, it will significantly reduce the load on the system if an artifact can be served from the cache instead of being regenerated each time.
+- **Traceability**: Past values of an artifact remain available. In a future release, it will be possible to compare the value of an artifact over time.
+- **Peer Review**: Artifacts are automatically part of the [Proposed Change](./proposed-change.md) review process.
+
+While the content of an artifact can change, its identifier will remain the same over time.

## High level design

-Artifacts are defined by grouping a [Transformation](./transformation.md) with a Group of targets in an Artifact Definition.
+Artifacts are defined by grouping a [transformation](./transformation.md) with a group of targets in an *Artifact Definition*.
+
+An **artifact definition** centralizes all the information required to generate an artifact:

-An Artifact Definition centralize all the information required to generate an artifact
- Group of targets
- Transformation
- Format of the output
-- Information to extract from each target that must be passed to the Transformation.
+- Information to extract from each target that must be passed to the transformation.

![](../media/artifact.excalidraw.svg)

-## Artifact Definition
+## Creating an artifact definition

-Artifact Definition can be created via the Frontend, via GraphQL or via a Git Repository
+Artifact definitions can be created via the frontend, via GraphQL, or via a Git repository.

-For Infrahub to automatically import an ArtifactDefinition from a Repository, it must be declare in the `.infrahub.yml` file at the root of the repository under the key `artifact_definitions`.
+For Infrahub to automatically import an artifact definition from a repository, it must be declared in the `.infrahub.yml` file at the root of the repository under the key `artifact_definitions`.

```yaml
---
@@ -51,17 +54,15 @@ artifact_definitions:
    transformation: ""
```

-## Artifact
-
-Artifact can be accessed via the frontend and via GraphQL but they shouldn't be manually created, all artifacts should be generated and managed by Infrahub.
+You can access an artifact via the frontend or GraphQL, but you shouldn't manually create them. Infrahub should generate and manage all artifacts.

## Examples

-### Startup Configuration for Edge devices
+### Startup configuration for edge devices

-The project [infrahub-demo-edge](https://github.com/opsmill/infrahub-demo-edge) includes most elements to generate the startup configuration of all Edge Devices.
+The project [infrahub-demo-edge](https://github.com/opsmill/infrahub-demo-edge) includes most elements required to generate the startup configuration of all edge devices.

-in the `.infrahub.yml` the actifact definition is configured as follow:
+In the `.infrahub.yml`, the artifact definition is configured as follows:

```yaml
artifact_definitions:
@@ -74,12 +75,7 @@ artifact_definitions:
    transformation: "device_startup"
```

-- `transformation: "device_startup"` reference the Transformation RFile also define in the same repository.
-  - The GraphQLQuery `device_startup_info` is indirectly connected to the Artifact Definition via the Transformation.
-- `targets: "edge_router"` reference a group of Edge routers named `edge_router`, it must be already present in Infrahub
-- `parameters` define the information that must be extracted from each member of the group and that must be passed to the Transformation. Here the Transformation `device_startup` must have a parameter `device` (coming from the GraphQL Query) to render the configuration properly. The value of `device` for each member of the group will be constructed by accessing the value of the name `name__value`
-
-
-
-
+- `transformation: "device_startup"` references the transformation RFile, which is also defined in the same repository.
+- The GraphQLQuery `device_startup_info` is indirectly connected to the artifact definition via the transformation.
+- `targets: "edge_router"` references a group of edge routers named `edge_router`. It must already exist in Infrahub.
+- `parameters` define the information that must be extracted from each member of the group and that must be passed to the transformation. Here, the transformation `device_startup` must have a parameter `device` (coming from the GraphQL Query) to render the configuration properly. The value of `device` for each member of the group will be constructed by accessing the value of its `name` attribute (`name__value`).
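To make the `parameters` mapping above concrete, a query like `device_startup_info` would typically accept the extracted value as a GraphQL variable. The sketch below is purely illustrative; the model and field names are assumptions and are not taken from the demo repository:

```graphql
# Hypothetical sketch only -- the actual query in infrahub-demo-edge may differ
query ($device: String!) {
  InfraDevice(name__value: $device) {
    edges {
      node {
        name {
          value
        }
      }
    }
  }
}
```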
diff --git a/docs/topics/auth.md b/docs/topics/auth.md
index 9e55d241a3..2af9e9b25f 100644
--- a/docs/topics/auth.md
+++ b/docs/topics/auth.md
@@ -1,36 +1,39 @@
---
label: User Management and Authentication
layout: default
-order: 900
---

-### User Management and Authentication
+### User management and authentication

Infrahub now supports standard user management and authentication systems.

A user account can have 3 levels of permissions
+
- `admin`
- `read-write`
- `read-only`

-By default, Infrahub will allow anonymous access in read-only. it's possible to disable this feature via the configuration `main.allow_anonymous_access` or via the environment variable `INFRAHUB_ALLOW_ANONYMOUS_ACCESS`
-
+By default, Infrahub will allow anonymous access in read-only mode. It's possible to disable this via the configuration `main.allow_anonymous_access` or via the environment variable `INFRAHUB_ALLOW_ANONYMOUS_ACCESS`.

#### Authentication mechanisms

Infrahub supports two authentication methods

-- JWT token: Short live token that are generated on demand from the API
-- API Token: Long live token generated ahead of time.
-> API token can be generated via the user profile page or via the Graphql interface.
+- JWT token: Short-lived tokens generated on demand from the API.
+- API token: Long-lived tokens generated ahead of time.
+
+> API tokens can be generated via the user profile page or via the GraphQL interface.

| | JWT | TOKEN |
-|--------------------|------|-------|
+| ------------------ | ---- | ----- |
| API / GraphQL | Yes | Yes |
| Frontend | Yes | No |
| Python SDK | Soon | Yes |
| infrahubctl | Soon | Yes |
| GraphQL Playground | No | Yes |

-While using the API the Authentication Token must be provided in a header named `X-INFRAHUB-KEY`
+!!!
+
+While using the API, the authentication token must be provided in a header named `X-INFRAHUB-KEY`.
+!!!
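As a quick illustration of the `X-INFRAHUB-KEY` header described above, the Python sketch below builds, but does not send, a GraphQL request carrying an API token. The host and token values are placeholders, not real endpoints:

```python
import json
import urllib.request

# Placeholder values -- substitute your own Infrahub host and API token
token = "my-api-token"
payload = {"query": "{ Branch { name { value } } }"}

# Build the request object; urllib.request.urlopen(req) would actually send it
req = urllib.request.Request(
    "https://infrahub.example.com/graphql",
    data=json.dumps(payload).encode(),
    headers={"X-INFRAHUB-KEY": token, "Content-Type": "application/json"},
)

# urllib stores header names in capitalized form internally
print(req.get_header("X-infrahub-key"))
```

The same header works for both the REST API and the GraphQL endpoint, per the table above.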
diff --git a/docs/topics/graphql.md b/docs/topics/graphql.md index 53a0bdf771..6dc842ce11 100644 --- a/docs/topics/graphql.md +++ b/docs/topics/graphql.md @@ -1,26 +1,25 @@ --- label: GraphQL Query layout: default -order: 800 --- # GraphQL -The GraphQL interface is the main interface to interact with Infrahub, the GraphQL Schema is automatically generated based on the core models and the user-defined schema models. +The GraphQL interface is the main interface to interact with Infrahub. The GraphQL schema is automatically generated based on the core models and the user-defined schema models. -The endpoint to interact with the main branch is accessible at `https:///graphql`. -To interact with a branch the url must include the name of the branch. `https:///graphql/` +The endpoint to interact with the main branch is accessible at `https:///graphql`. +To interact with a branch the URL must include the name of the branch, such as `https:///graphql/`. -## Query & Mutations +## Query & mutations -For each model in the schema, a GraphQL Query and 3 Mutations will be generated based on the namespace and the name of the model. +For each model in the schema, a GraphQL query and 3 mutations will be generated based on the namespace and the name of the model. 
-For example, for the model `CoreRepository` the following Query and Mutations have been generated:
+For example, for the model `CoreRepository`, the following query and mutations have been generated:

- `Query` : **CoreRepository**
- `Mutation` : **CoreRepositoryCreate**
- `Mutation` : **CoreRepositoryUpdate**
- `Mutation` : **CoreRepositoryDelete**

-### Query Format
+### Query format

The top level query for each model will always return a list of objects and the query will have the following format `CoreRepository` > `edges` > `node` > `display_label`

@@ -53,12 +52,12 @@ At the object level, there are mainly 3 types of resources that can be accessed,

#### Attribute

-Each Attribute is its own object in GraphQL to expose the value and all the metadata.
+Each attribute is its own object in GraphQL to expose the value and all the metadata.

In the query below, to access the attribute **name** of the object the query must be `CoreRepository` > `edges` > `node` > `name` > `value`. At the same level all the metadata of the attribute are also available example : `is_protected`, `is_visible`, `source` & `owner`

-```graphql #6-14 Example query to access the value and the properties of the Attribute 'name'
+```graphql #6-14 Example query to access the value and the properties of the attribute 'name'
query {
  CoreRepository {
    count
@@ -81,7 +80,7 @@ query {

#### Relationship of `Cardinality One`

-A Relationship to another model with a cardinality of `One` will be represented with a `NestedEdged` object composed of a `node` and a `properties` objects. The `node` gives access to the remote `node` (the peer of the relationship) while `properties` gives access to the properties of the relationship itself.
+A relationship to another model with a cardinality of `One` will be represented with a `NestedEdged` object composed of a `node` and a `properties` object.
The `node` gives access to the remote `node` (the peer of the relationship) while `properties` gives access to the properties of the relationship itself.

```graphql #6-19 Example query to access the peer and the properties of the relationship 'account', with a cardinality of one.
query {
@@ -111,7 +110,7 @@ query {

#### Relationship of `Cardinality Many`

-A Relationship with a cardinality of `Many` will be represented with a `NestedPaginated` object composed. It was the same format as the top level `PaginatedObject` with `count` and `edges` but the child element will expose both `node` and `properties`. The `node` gives access to the remote `node` (the peer of the relationship) while `properties` gives access to the properties of the relationship itself.
+A relationship with a cardinality of `Many` will be represented with a `NestedPaginated` object. It has the same format as the top-level `PaginatedObject`, with `count` and `edges`, but each child element will expose both `node` and `properties`. The `node` gives access to the remote `node` (the peer of the relationship) while `properties` gives access to the properties of the relationship itself.

```graphql #6-20 Example query to access the relationship 'tags', with a cardinality of Many.
query {
@@ -140,19 +139,21 @@ query {
}
```

-### Mutations Format
+### Mutations format
+
+The format of the mutation to `Create` and `Update` an object has some similarities with the query format. The format will be slightly different for:

-The format of the Mutation to Create & Update an object have some similarities with the Query format and similartly, the format will be slightly different for :
- An `Attribute`
-- A Relationship of `Cardinality One`
-- A Relationship of `Cardinality Many`
+- A relationship of `Cardinality One`
+- A relationship of `Cardinality Many`
+
+#### Create and update

-#### Create & Update
+To `Create` or `Update` an object, the mutations will have the following properties.
-To `Create` or `Update` an object, the mutations will have the following properties -- The Input for the mutation must be provided inside `data` +- The input for the mutation must be provided inside `data`. - All mutations will return `ok` and `object` to access some information after the mutation has been executed. -- For `Update`, it is mandatory to provide an `id` +- For `Update`, it is mandatory to provide an `id`. ```graphql mutation { @@ -171,39 +172,38 @@ mutation { } ``` -## Branch Management +## Branch management -In addition to the Query and the Mutations automatically generated based on the schema, there are some Query and Mutations to interact with the Branches. +In addition to the queries and the mutations automatically generated based on the schema, there are some queries and mutations to interact with the branches. - **Query**: `Branch`, Query a list of all branches - **Mutation**: `BranchCreate`, Create a new branch -- **Mutation**: `BranchUpdate`, Update the descrition of a branch -- **Mutation**: `BranchDelete`, Delete an existing Branch -- **Mutation**: `BranchRebase`, Rebase an existing Branch with the main Branch -- **Mutation**: `BranchMerge`, Merge a Branch into main +- **Mutation**: `BranchUpdate`, Update the description of a branch +- **Mutation**: `BranchDelete`, Delete an existing branch +- **Mutation**: `BranchRebase`, Rebase an existing branch with the main branch +- **Mutation**: `BranchMerge`, Merge a branch into main - **Mutation**: `BranchValidate`, Validate if a branch has some conflicts ## GraphQLQuery -The GraphQLQuery Model has been designed to store a GraphQL Query in order to simplify its execution and to associate it with other internal objects like `Transformation`. +The `GraphQLQuery` model has been designed to store a GraphQL query in order to simplify its execution and to associate it with other internal objects like `Transformation`. 
-A GraphQLQuery object can be created directly from the API or it can be imported from a Git Repository.
+A `GraphQLQuery` object can be created directly from the API or it can be imported from a Git repository.

-Every time a GraphQLQuery is being Created or Updated, the content of the query will be analized to
-- Ensure the query is valid and is compatible with the schema
-- Extract some information about the query itself (see below)
+Every time a `GraphQLQuery` is created or updated, the content of the query will be analyzed to:
+
+- Ensure the query is valid and compatible with the schema.
+- Extract some information about the query itself (see below).

### Information extracted from the query

-- Type of Operations present in the Query [Query, Mutation, Subscription]
+
+- Type of operations present in the query [Query, Mutation, Subscription]
- Variables accepted by the query
- Depth, number of nested levels in the query
- Height, total number of fields requested in the query
- List of Infrahub models referenced in the query

-### Import from a Git Repository
-
-The Git Agent will automatically try to import all files with the extension `.gql` into a GraphQLQuery with the name of the file as the name of the query.
-
-
+### Import from a Git repository
+The Git agent will automatically try to import all files with the extension `.gql` into a `GraphQLQuery`, with the name of the file as the name of the query.
diff --git a/docs/topics/local-demo-environment.md b/docs/topics/local-demo-environment.md
index ab755920f5..327f873b33 100644
--- a/docs/topics/local-demo-environment.md
+++ b/docs/topics/local-demo-environment.md
@@ -1,65 +1,71 @@
---
label: Demo Environment
layout: default
-order: 100
---

# Local Demo Environment

-A local environment based on Docker Composed is available for demo and testing.
-It's designed to be controlled by `invoke` using a list of predefined commands - -| Command | Description | { class="compact" } -| --------------------- | ------------------------------------------------------------ | -| `demo.build` | Build an image with the provided name and python version. | -| `demo.init` | (deprecated) Initialize Infrahub database before using it the first time. | -| `demo.start` | Start a local instance of Infrahub within docker compose. | -| `demo.stop` | Stop the running instance of Infrahub. | -| `demo.destroy` | Destroy all containers and volumes. | -| `demo.cli-git` | Launch a bash shell inside the running Infrahub container. | -| `demo.cli-server` | Launch a bash shell inside the running Infrahub container. | -| `demo.debug` | Start a local instance of Infrahub in debug mode. | -| `demo.status` | Display the status of all containers. | -| `demo.load-infra-schema` | Load the infrastructure_base schema into Infrahub. | -| `demo.load-infra-data` | Generate some data representing a small networks with 6 devices. | +A local environment based on Docker Compose is available for demo and testing. +It's designed to be controlled by `invoke` using a list of predefined commands. + +{ class="compact" } +| Command | Description | +| ------------------------ | ------------------------------------------------------------------------- | +| `demo.build` | Build an image with the provided name and Python version. | +| `demo.init` | (deprecated) Initialize Infrahub database before using it the first time. | +| `demo.start` | Start a local instance of Infrahub within docker compose. | +| `demo.stop` | Stop the running instance of Infrahub. | +| `demo.destroy` | Destroy all containers and volumes. | +| `demo.cli-git` | Launch a bash shell inside the running Infrahub container. | +| `demo.cli-server` | Launch a bash shell inside the running Infrahub container. | +| `demo.debug` | Start a local instance of Infrahub in debug mode. 
+| `demo.status` | Display the status of all containers. |
+| `demo.load-infra-schema` | Load the `infrastructure_base` schema into Infrahub. |
+| `demo.load-infra-data` | Generate some data representing a small network with 6 devices. |

## Topology

-| Container Name | Image | Description | { class="compact" }
-| --------------- | ------------------------ | ------------------------------------------------------ |
-| **database** | memgraph/memgraph:2.11.0 or neo4j:5.6-enterprise | Graph Database |
-| **message-queue** | rabbitmq:3.12-management | Message bus based on RabbitMQ |
-| **cache** | redis:7.2 | Cache based on Redis, mainly used for distributed lock |
-| **infrahub-server** | Dockerfile | Instance of the API Server, running GraphQL |
-| **infrahub-git** | Dockerfile | Instance of the Git Agent, managing the Git Repository |
-| **frontend** | Dockerfile | Instance of the Frontend |
+{ class="compact" }
+| Container Name | Image | Description |
+| ------------------- | ------------------------------------------------------ | ------------------------------------------------------ |
+| **database** | memgraph/memgraph:2.11.0 or neo4j:5.6-enterprise | Graph Database |
+| **message-queue** | rabbitmq:3.12-management | Message bus based on RabbitMQ |
+| **cache** | redis:7.2 | Cache based on Redis, mainly used for distributed lock |
+| **infrahub-server** | Dockerfile | Instance of the API server, running GraphQL |
+| **infrahub-git** | Dockerfile | Instance of the Git agent, managing the Git Repository |
+| **frontend** | Dockerfile | Instance of the Frontend |

[!ref Check the architecture diagram for more information about each component](./architecture.md)

-## Getting Started
+## Getting started

-### Pre-Requisite
+### Prerequisites

In order to run the demo environment, the following applications must be installed on the system:
+
- [pyinvoke](https://www.pyinvoke.org/)
- Docker & Docker Compose

-> On a Laptop, both Docker & Docker Compose can be installed by installing [Docker Desktop](https://www.docker.com/products/docker-desktop/)
+> On a laptop, both Docker & Docker Compose can be installed via [Docker Desktop](https://www.docker.com/products/docker-desktop/).

### First utilization

Before the first utilization, you need to build the images for Infrahub with the command:
-```
+
+```sh
invoke demo.build
```
+
Initialize the database and start the application:
-```
+
+```sh
invoke demo.start
```

### Load some data

Once you have an environment up and running, you can load your own schema or explore the one provided with the project using the following commands.
-```
+
+```sh
invoke demo.load-infra-schema
invoke demo.load-infra-data
```
@@ -70,39 +76,41 @@ invoke demo.load-infra-data
- `invoke demo.stop` : Stop all the containers
- `invoke demo.destroy` : Destroy all containers and volumes.

- !!!
+
`invoke demo.debug` can be used as an alternative to `invoke demo.start`; the main difference is that it will stay *attached* to the containers and all the logs will be displayed in real time in the CLI.
+
!!!
-## Advanced Settings
+## Advanced settings

### Support for `sudo`

-On a linux system, the system will try to automatically detect if `sudo` is required to run the docker command or not.
+On a Linux system, the tooling will automatically detect whether `sudo` is required to run the `docker` command.

It's possible to control this setting with the environment variable: `INVOKE_SUDO`

-```
+```sh
export INVOKE_SUDO=1  # force sudo
export INVOKE_SUDO=0  # disable it completely
```

### Support for `pty`
-On Linux and Mac OS, all commands will be executed with PTY enabled by default.
+On Linux and macOS, all commands will be executed with PTY enabled by default.

It's possible to control this setting with the environment variable: `INVOKE_PTY`

-```
+```sh
export INVOKE_PTY=1  # force pty
export INVOKE_PTY=0  # disable it completely
```

## Troubleshooting

-At First, it's recommended to check if all containers are still running using `invoke demo.status`. The 5 containers should be running and be present.
-- If one is not running, you can try to restart it with `invoke demo.start`
-- If the container is still not coming up, you can watch the logs with `docker logs <container name>` (the container name will include the name of the project and a number like `infrahub-dev-infrahub-git-1` )
+It's recommended to check if all containers are still running using `invoke demo.status`. All 5 containers should be present and running.
+
+- If one is not running, you can try to restart it with `invoke demo.start`.
+- If the container is still not coming up, you can watch the logs with `docker logs <container name>` (the container name will include the name of the project and a number, e.g., `infrahub-dev-infrahub-git-1`).

-If some containers are still not coming up, it's recommanded to start from a fresh install with `invoke demo.destroy`.
\ No newline at end of file
+If some containers are still not coming up, it's recommended to start from a fresh install with `invoke demo.destroy`.
diff --git a/docs/topics/object-storage.md b/docs/topics/object-storage.md
index fc9dd43b62..2ee7a8e205 100644
--- a/docs/topics/object-storage.md
+++ b/docs/topics/object-storage.md
@@ -1,17 +1,17 @@
---
label: Object Storage
layout: default
-order: 500
---

+# Object storage

-Infrahub provides an interface to easily store and retrieve files in an object storage. The object storage interface is independent of the branches.
+Infrahub provides an interface to store and retrieve files in object storage. The object storage interface is independent of branches.

-Currently only a local backend is supported but the goal over time is to support multiple backend like AWS S3 to allow users to select where they would like their files to be stored.
+Currently, Infrahub only supports a local backend. The goal over time is to support multiple backends, such as AWS S3, to allow users to select where they would like to store their files.

-Currently the main interface to interact with the object storage is the REST API, 3 methods are supported
+Currently, the main interface to interact with the object storage is the REST API; three methods are supported:

- GET /api/storage/object/{identifier}
- POST /api/storage/upload/content
- POST /api/storage/upload/file

-Please check the API documentation for more details
+Please check the API documentation for more details.
diff --git a/docs/topics/proposed-change.md b/docs/topics/proposed-change.md
index ef5907cb17..3345a04a5d 100644
--- a/docs/topics/proposed-change.md
+++ b/docs/topics/proposed-change.md
@@ -1,18 +1,27 @@
---
-label: Proposed Change
+label: Proposed change
layout: default
-order: 150
---

+# Proposed change
+
A proposed change provides a way to review and discuss how two branches differ from each other and to merge a source branch into the target branch. For people with a development background, this will sound very familiar. It’s like a pull or merge request.
The proposed change lets you compare two branches, run tests, and finally merge one branch into another.
+
## Discussions and issues as part of the review
-A reviewer of the proposed change can open discussions and write comments, request changes. Once you resolve any requested change, the reviewer would approve the proposed change before they merge it.
+
+A reviewer of the proposed change can open discussions, write comments, and request changes. Once you resolve any requested changes, the reviewer can approve the proposed change before it is merged.
+
## An alternative approach to diff
-In a pull request in GitHub, the diff between two branches is a diff seen from a plain text point of view. A proposed change in Infrahub allows you to see changes in data, as well as the type of diff you’d see in Git. By combining the two, someone reviewing a proposed change in Infrahub can view the diff between [artifacts](artifact) on each branch.
+
+In a pull request on GitHub, the diff between two branches is seen from a plain text point of view. A proposed change in Infrahub allows you to see changes in data, as well as the type of diff you’d see in Git. By combining the two, someone reviewing a proposed change in Infrahub can view the diff between [artifacts](artifact) on each branch.
+
With this feature, you can create a new branch, change a node attribute in the database, and see how your modifications impact the rendered artifacts. It includes a diff view to see exactly how a configuration might change if the proposed change were to be accepted and merged.

-## Continuous Integration - CI
+## Continuous integration - CI
+
Just like you’d expect for a GitHub pull request, you can run checks on a proposed change during the review process and before merging. Infrahub will run data integrity checks between the proposed change branches. Besides this, Infrahub reports any merge conflicts for connected Git repositories.
+
Infrahub handles custom checks with code through Git repositories. These checks let you verify the integrity of the database using your custom business logic. A check of this type could be anything you can imagine. An example could be to ensure that at least one router on each site is in an operational status as opposed to being in maintenance mode.

## Conflict resolution
-Infrahub will prevent merging a proposed change if there is a data conflict between the branches. An example of such a conflict could be if someone were to update the same attribute of an object in both branches. In order to merge a proposed change that has conflicts, they need to be resolved. To resolve conflicts, you need to review data integrity checks and choose which branch to keep in the change checks section.
\ No newline at end of file
+
+Infrahub will prevent merging a proposed change if there is a data conflict between the branches. An example of such a conflict could be if someone were to update the same attribute of an object in both branches. In order to merge a proposed change that has conflicts, the conflicts first need to be resolved. To do so, you review the data integrity checks and choose which branch to keep in the change checks section.
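To illustrate the kind of custom check described above, here is a hedged sketch of the router-status example as plain Python. The data layout mimics a GraphQL query response and is an assumption; real checks are implemented against Infrahub's SDK, so treat this as an outline of the logic only.

```python
def validate_router_status(data):
    """Return one error message per site that has no router in "active" status."""
    errors = []
    for site in data["sites"]:
        # Collect the status of every device with the router role on this site.
        statuses = [
            device["status"]
            for device in site["devices"]
            if device["role"] == "router"
        ]
        if "active" not in statuses:
            errors.append(f"site {site['name']}: no router in active status")
    return errors


# Hypothetical query result: one compliant site, one that should fail the check.
sites = {
    "sites": [
        {"name": "atl1", "devices": [{"role": "router", "status": "active"}]},
        {"name": "ord1", "devices": [{"role": "router", "status": "maintenance"}]},
    ]
}
print(validate_router_status(sites))  # → ['site ord1: no router in active status']
```

An empty error list means the check passes; any entries would surface as failures in the proposed change's checks.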
diff --git a/docs/topics/readme.md b/docs/topics/readme.md
index da8ba9caf1..5cf645f8f0 100644
--- a/docs/topics/readme.md
+++ b/docs/topics/readme.md
@@ -5,5 +5,5 @@
- [Authentication](./auth.md)
- [Artifact](./artifact.md)
- [Object Storage](./object-storage.md)
-- [Proposed Changes](./proposed-change.md)
+- [Proposed Change](./proposed-change.md)
- [Demo Environment](./local-demo-environment.md)
\ No newline at end of file
diff --git a/docs/topics/transformation.md b/docs/topics/transformation.md
index 4fcdd375f4..f94efa02bd 100644
--- a/docs/topics/transformation.md
+++ b/docs/topics/transformation.md
@@ -1,59 +1,61 @@
---
label: Transformation
layout: default
-order: 900
---

# Transformation

-A `Transformation` is a generic plugin to transform a dataset into a different format to simplify it's ingestion by a third party systems.
+A `Transformation` is a generic plugin to transform a dataset into a different format to simplify its ingestion by third-party systems.

The output of a transformation can be either in JSON format or in plain text.

-*Currently transformation must be written in Python but in the future more languages could be supported.*
+>*Currently transformations must be written in Python, but in the future more languages could be supported.*

!!!success Examples
+
- With the `Jinja Plugin` it's possible to generate any configuration file, in plain text format.
-- With the `Python Plugin` its's possible to generate the payload expected by CloudFormation to configure a resource in AWS.
+- With the `Python Plugin` it's possible to generate the payload expected by CloudFormation to configure a resource in AWS.
+
!!!

## High level design

-A Transformation is composed of 2 main components:
-- A **GraphQL Query** that will define what is the input data
+A transformation is composed of 2 main components:
+
+- A **GraphQL query** that will define the input data.
- A **Transformation logic** that will process the data and transform it.
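To make the second component concrete, a transformation's logic is conceptually just a function from GraphQL query results to an output document. Below is a minimal sketch in plain Python; the field names and data shape are assumptions for illustration, not Infrahub's actual SDK interface.

```python
def transform(data):
    """Reshape a device query response into the payload a third-party system expects."""
    return {
        "inventory": [
            {"hostname": device["name"], "platform": device["platform"]}
            for device in data["devices"]
        ]
    }


# The GraphQL query would supply `data`; here we pass a hypothetical response.
payload = transform({"devices": [{"name": "atl1-edge1", "platform": "eos"}]})
print(payload)  # → {'inventory': [{'hostname': 'atl1-edge1', 'platform': 'eos'}]}
```

The same pattern applies whether the output is a JSON document, as here, or plain text rendered through a template.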
![](../media/transformation.excalidraw.svg) - !!! -The Transformation will automatically inherit the parameters (variables) defined by the GraphQL query. Depending on how the GraphQL query has been constructed, a transformation can be static or work for multiple objects. +The transformation will automatically inherit the parameters (variables) defined by the GraphQL query. Depending on how the GraphQL query has been constructed, a transformation can be static or work for multiple objects. !!! ==- Common parameters - -| Name | Type | Default | Required | { class="compact" } -| ------------------ | --------- | ----- | --- | -| **name** | `Text` | - | Yes | -| **label** | `Text` | - | No | -| **description** | `Text` | - | No | -| **timeout** | `Number` | 10 | No | -| **rebase** | `Boolean` | False | No | -| **query** | `Relationship`
CoreGraphQLQuery | - | Yes | -| **repository** | `Relationship`
CoreRepository | - | Yes | +{ class="compact" } +| Name | Type | Default | Required | +| --------------- | ----------------------------------- | ------- | -------- | +| **name** | `Text` | - | Yes | +| **label** | `Text` | - | No | +| **description** | `Text` | - | No | +| **timeout** | `Number` | 10 | No | +| **rebase** | `Boolean` | False | No | +| **query** | `Relationship`
CoreGraphQLQuery | - | Yes | +| **repository** | `Relationship`
CoreRepository | - | Yes |

==-

-## Available Transformation
+## Available transformations

+{ class="compact" }
| Namespace | Transformation | Description | Language | Output Format |
-|-----------|---------------------|----------------------------------------|----------|---------------|
+| --------- | ------------------- | -------------------------------------- | -------- | ------------- |
| Core | **RFile** | A file rendered from a Jinja2 template | Jinja2 | Plain Text |
| Core | **TransformPython** | A transform function written in Python | Python | JSON |

### RFile (Jinja2 Plugin)

-An RFile is a Transformation plugin for Jinja2, it can generate any file in plain text format and must be composed of 1 main Jinja2 template and 1 GraphQL Query.
+An RFile is a transformation plugin for Jinja2; it can generate any file in plain text format and must be composed of 1 main Jinja2 template and 1 GraphQL query.

#### Create an RFile

@@ -77,30 +79,35 @@ rfiles:
#### Render an RFile

An RFile can be rendered with 3 different methods:
+
- On demand via the REST API
-- As part of an [Artifact](./artifact.md)
+- As part of an [artifact](./artifact.md)
- In CLI for development and troubleshooting

##### From the REST API

-A RFile can be rendered on demand via the REST API with the endpoint : `https://<host>/api/rfile/<rfile-name>`
+An RFile can be rendered on demand via the REST API with the endpoint: `https://<host>/api/rfile/<rfile-name>`
+
+This endpoint is branch-aware and it accepts the name of the branch and/or the time as URL parameters.

-This endpoint is branch aware and it accept the name of the branch and/or the time in as a URL parameters
- `https://<host>/api/rfile/<rfile-name>?branch=branch33`
- `https://<host>/api/rfile/<rfile-name>?branch=branch33&at=