import functools
import itertools
import logging
import os
import posixpath
import re
import urllib.parse
import urllib.request
from dataclasses import dataclass
from typing import (
    TYPE_CHECKING,
    Any,
    Dict,
    List,
    Mapping,
    NamedTuple,
    Optional,
    Tuple,
    Union,
)

from pip._internal.utils.deprecation import deprecated
from pip._internal.utils.filetypes import WHEEL_EXTENSION
from pip._internal.utils.hashes import Hashes
from pip._internal.utils.misc import (
    pairwise,
    redact_auth_from_url,
    split_auth_from_netloc,
    splitext,
)
from pip._internal.utils.urls import path_to_url, url_to_path

if TYPE_CHECKING:
    from pip._internal.index.collector import IndexContent

logger = logging.getLogger(__name__)

_SUPPORTED_HASHES = ("sha512", "sha384", "sha256", "sha224", "sha1", "md5")


@dataclass(frozen=True)
class LinkHash:
    """Links to content may have embedded hash values. This class parses those.

    `name` must be any member of `_SUPPORTED_HASHES`.

    This class can be converted to and from `ArchiveInfo`. While ArchiveInfo intends to
    be JSON-serializable to conform to PEP 610, this class contains the logic for
    parsing a hash name and value for correctness, and then checking whether that hash
    conforms to a schema with `.is_hash_allowed()`."""

    name: str
    value: str

    # Note: the hex digest itself is not validated here; it is kept verbatim and
    # only checked against Hashes when hash-checking is actually performed.
    _hash_url_fragment_re = re.compile(
        r"[#&]({choices})=([^&]*)".format(
            choices="|".join(re.escape(hash_name) for hash_name in _SUPPORTED_HASHES)
        ),
    )

    def __post_init__(self) -> None:
        assert self.name in _SUPPORTED_HASHES

    @classmethod
    @functools.lru_cache(maxsize=None)
    def find_hash_url_fragment(cls, url: str) -> Optional["LinkHash"]:
        """Search a string for a checksum algorithm name and encoded output value."""
        match = cls._hash_url_fragment_re.search(url)
        if match is None:
            return None
        name, value = match.groups()
        return cls(name=name, value=value)

    def as_dict(self) -> Dict[str, str]:
        return {self.name: self.value}

    def as_hashes(self) -> Hashes:
        """Return a Hashes instance which checks only for the current hash."""
        return Hashes({self.name: [self.value]})

    def is_hash_allowed(self, hashes: Optional[Hashes]) -> bool:
        """
        Return True if the current hash is allowed by `hashes`.
        """
        if hashes is None:
            return False
        return hashes.is_hash_allowed(self.name, hex_digest=self.value)


@dataclass(frozen=True)
class MetadataFile:
    """Information about a core metadata file associated with a distribution."""

    hashes: Optional[Dict[str, str]]

    def __post_init__(self) -> None:
        if self.hashes is not None:
            assert all(name in _SUPPORTED_HASHES for name in self.hashes)


def supported_hashes(hashes: Optional[Dict[str, str]]) -> Optional[Dict[str, str]]:
    # Remove any unsupported hash types from the mapping. If this leaves no
    # supported hashes, return None.
    if hashes is None:
        return None
    hashes = {n: v for n, v in hashes.items() if n in _SUPPORTED_HASHES}
    if not hashes:
        return None
    return hashes


def _clean_url_path_part(part: str) -> str:
    """
    Clean a "part" of a URL path (i.e. after splitting on "@" characters).
    """
    # Unquote prior to quoting so nothing ends up double-quoted.
    return urllib.parse.quote(urllib.parse.unquote(part))


def _clean_file_url_path(part: str) -> str:
    """
    Clean the first part of a URL path that corresponds to a local
    filesystem path (i.e. the first part after splitting on "@" characters).
    """
    # Unquote prior to quoting so nothing ends up double-quoted. Going through
    # urllib.request also leaves Windows drive letters (e.g. "C:") unquoted.
    return urllib.request.pathname2url(urllib.request.url2pathname(part))


# percent-encoded:                   /
_reserved_chars_re = re.compile("(@|%2F)", re.IGNORECASE)


def _clean_url_path(path: str, is_local_path: bool) -> str:
    """
    Clean the path portion of a URL.
    """
    if is_local_path:
        clean_func = _clean_file_url_path
    else:
        clean_func = _clean_url_path_part

    # Split on the reserved characters prior to cleaning so that
    # VCS revision strings containing "@" are preserved.
    parts = _reserved_chars_re.split(path)

    cleaned_parts = []
    for to_clean, reserved in pairwise(itertools.chain(parts, [""])):
        cleaned_parts.append(clean_func(to_clean))
        # Normalize %xx escapes (e.g. %2f -> %2F).
        cleaned_parts.append(reserved.upper())

    return "".join(cleaned_parts)


def _ensure_quoted_url(url: str) -> str:
    """
    Make sure a link is fully quoted.
    For example, if ' ' occurs in the URL, it will be replaced with "%20",
    and without double-quoting other characters.
    """
    # Split the URL into parts according to the general structure
    # `scheme://netloc/path;parameters?query#fragment`.
    result = urllib.parse.urlparse(url)
    # If the netloc is empty, the URL refers to a local filesystem path.
    is_local_path = not result.netloc
    path = _clean_url_path(result.path, is_local_path=is_local_path)
    return urllib.parse.urlunparse(result._replace(path=path))


@functools.total_ordering
class Link:
    """Represents a parsed link from a Package Index's simple URL"""

    __slots__ = [
        "_parsed_url",
        "_url",
        "_hashes",
        "comes_from",
        "requires_python",
        "yanked_reason",
        "metadata_file_data",
        "cache_link_parsing",
        "egg_fragment",
    ]

    def __init__(
        self,
        url: str,
        comes_from: Optional[Union[str, "IndexContent"]] = None,
        requires_python: Optional[str] = None,
        yanked_reason: Optional[str] = None,
        metadata_file_data: Optional[MetadataFile] = None,
        cache_link_parsing: bool = True,
        hashes: Optional[Mapping[str, str]] = None,
    ) -> None:
        """
        :param url: url of the resource pointed to (href of the link)
        :param comes_from: instance of IndexContent where the link was found,
            or string.
        :param requires_python: String containing the `Requires-Python`
            metadata field, specified in PEP 345. This may be specified by
            a data-requires-python attribute in the HTML link tag, as
            described in PEP 503.
        :param yanked_reason: the reason the file has been yanked, if the
            file has been yanked, or None if the file hasn't been yanked.
            This is the value of the "data-yanked" attribute, if present, in
            a simple repository HTML link. If the file has been yanked but
            no reason was provided, this should be the empty string. See
            PEP 592 for more information and the specification.
        :param metadata_file_data: the metadata attached to the file, or None if
            no such metadata is provided. This argument, if not None, indicates
            that a separate metadata file exists, and also optionally supplies
            hashes for that file.
        :param cache_link_parsing: A flag that is used elsewhere to determine
            whether resources retrieved from this link should be cached. PyPI
            URLs should generally have this set to False, for example.
        :param hashes: A mapping of hash names to digests to allow us to
            determine the validity of a download.
        """

        # The url can be a UNC Windows share.
        if url.startswith("\\\\"):
            url = path_to_url(url)

        self._parsed_url = urllib.parse.urlsplit(url)
        # Store the url as a private attribute to prevent accidentally
        # trying to set a new value.
        self._url = url

        link_hash = LinkHash.find_hash_url_fragment(url)
        hashes_from_link = {} if link_hash is None else link_hash.as_dict()
        if hashes is None:
            self._hashes = hashes_from_link
        else:
            self._hashes = {**hashes, **hashes_from_link}

        self.comes_from = comes_from
        self.requires_python = requires_python if requires_python else None
        self.yanked_reason = yanked_reason
        self.metadata_file_data = metadata_file_data

        self.cache_link_parsing = cache_link_parsing
        self.egg_fragment = self._egg_fragment()

    @classmethod
    def from_json(
        cls,
        file_data: Dict[str, Any],
        page_url: str,
    ) -> Optional["Link"]:
        """
        Convert a PyPI JSON document from a simple repository page into a Link.
        """
        file_url = file_data.get("url")
        if file_url is None:
            return None

        url = _ensure_quoted_url(urllib.parse.urljoin(page_url, file_url))
        pyrequire = file_data.get("requires-python")
        yanked_reason = file_data.get("yanked")
        hashes = file_data.get("hashes", {})

        # PEP 714: prefer the "core-metadata" key, falling back to the older
        # "dist-info-metadata" name for compatibility.
        metadata_info = file_data.get("core-metadata")
        if metadata_info is None:
            metadata_info = file_data.get("dist-info-metadata")

        # The metadata info value may be a boolean, or a dict of hashes.
        if isinstance(metadata_info, dict):
            # The file exists, and hashes have been supplied.
            metadata_file_data = MetadataFile(supported_hashes(metadata_info))
        elif metadata_info:
            # The file exists, but there are no hashes.
            metadata_file_data = MetadataFile(None)
        else:
            # False or not present: the file does not exist.
            metadata_file_data = None

        # Link.yanked_reason expects an empty string instead of a boolean,
        # and None instead of False.
        if yanked_reason and not isinstance(yanked_reason, str):
            yanked_reason = ""
        elif not yanked_reason:
            yanked_reason = None

        return cls(
            url,
            comes_from=page_url,
            requires_python=pyrequire,
            yanked_reason=yanked_reason,
            hashes=hashes,
            metadata_file_data=metadata_file_data,
        )

    @classmethod
    def from_element(
        cls,
        anchor_attribs: Dict[str, Optional[str]],
        page_url: str,
        base_url: str,
    ) -> Optional["Link"]:
        """
        Convert an anchor element's attributes in a simple repository page to a Link.
        """
        href = anchor_attribs.get("href")
        if not href:
            return None

        url = _ensure_quoted_url(urllib.parse.urljoin(base_url, href))
        pyrequire = anchor_attribs.get("data-requires-python")
        yanked_reason = anchor_attribs.get("data-yanked")

        # PEP 714: prefer the "data-core-metadata" attribute, falling back to
        # the older "data-dist-info-metadata" name for compatibility.
        metadata_info = anchor_attribs.get("data-core-metadata")
        if metadata_info is None:
            metadata_info = anchor_attribs.get("data-dist-info-metadata")
        # The metadata info value may be the string "true", or a "hashname=hashval"
        # pair.
        if metadata_info == "true":
            # The file exists, but there are no hashes.
            metadata_file_data = MetadataFile(None)
        elif metadata_info is None:
            # The file does not exist.
            metadata_file_data = None
        else:
            # The file exists, and hashes have been supplied.
            hashname, sep, hashval = metadata_info.partition("=")
            if sep == "=":
                metadata_file_data = MetadataFile(
                    supported_hashes({hashname: hashval})
                )
            else:
                # The attribute is malformed. Treat it as "no hashes supplied".
                logger.debug(
                    "Index returned invalid data-dist-info-metadata value: %s",
                    metadata_info,
                )
                metadata_file_data = MetadataFile(None)

        return cls(
            url,
            comes_from=page_url,
            requires_python=pyrequire,
            yanked_reason=yanked_reason,
            metadata_file_data=metadata_file_data,
        )

    def __str__(self) -> str:
        if self.requires_python:
            rp = f" (requires-python:{self.requires_python})"
        else:
            rp = ""
        if self.comes_from:
            return f"{redact_auth_from_url(self._url)} (from {self.comes_from}){rp}"
        else:
            return redact_auth_from_url(str(self._url))

    def __repr__(self) -> str:
        return f"<Link {self}>"

    def __hash__(self) -> int:
        return hash(self.url)

    def __eq__(self, other: Any) -> bool:
        if not isinstance(other, Link):
            return NotImplemented
        return self.url == other.url

    def __lt__(self, other: Any) -> bool:
        if not isinstance(other, Link):
            return NotImplemented
        return self.url < other.url

    @property
    def url(self) -> str:
        return self._url

    @property
    def filename(self) -> str:
        path = self.path.rstrip("/")
        name = posixpath.basename(path)
        if not name:
            # Make sure we don't leak auth information if the netloc
            # includes a username and password.
            netloc, user_pass = split_auth_from_netloc(self.netloc)
            return netloc

        name = urllib.parse.unquote(name)
        assert name, f"URL {self._url!r} produced no filename"
        return name

    @property
    def file_path(self) -> str:
        return url_to_path(self.url)

    @property
    def scheme(self) -> str:
        return self._parsed_url.scheme

    @property
    def netloc(self) -> str:
        """
        This can contain auth information.
        """
        return self._parsed_url.netloc

    @property
    def path(self) -> str:
        return urllib.parse.unquote(self._parsed_url.path)

    def splitext(self) -> Tuple[str, str]:
        return splitext(posixpath.basename(self.path.rstrip("/")))

    @property
    def ext(self) -> str:
        return self.splitext()[1]

    @property
    def url_without_fragment(self) -> str:
        scheme, netloc, path, query, fragment = self._parsed_url
        return urllib.parse.urlunsplit((scheme, netloc, path, query, ""))

    _egg_fragment_re = re.compile(r"[#&]egg=([^&]*)")

    # Per PEP 508.
    _project_name_re = re.compile(
        r"^([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$", re.IGNORECASE
    )

    def _egg_fragment(self) -> Optional[str]:
        match = self._egg_fragment_re.search(self._url)
        if not match:
            return None

        # An egg fragment looks like a PEP 508 project name (parsed later)
        # plus an optional extras suffix.
        project_name = match.group(1)
        if not self._project_name_re.match(project_name):
            deprecated(
                reason=f"{self} contains an egg fragment with a non-PEP 508 name",
                replacement="to use the req @ url syntax, and remove the egg fragment",
                gone_in="25.0",
                issue=11617,
            )

        return project_name

    _subdirectory_fragment_re = re.compile(r"[#&]subdirectory=([^&]*)")

    @property
    def subdirectory_fragment(self) -> Optional[str]:
        match = self._subdirectory_fragment_re.search(self._url)
        if not match:
            return None
        return match.group(1)

    def metadata_link(self) -> Optional["Link"]:
        """Return a link to the associated core metadata file (if any)."""
        if self.metadata_file_data is None:
            return None
        metadata_url = f"{self.url_without_fragment}.metadata"
        if self.metadata_file_data.hashes is None:
            return Link(metadata_url)
        return Link(metadata_url, hashes=self.metadata_file_data.hashes)

    def as_hashes(self) -> Hashes:
        return Hashes({k: [v] for k, v in self._hashes.items()})

    @property
    def hash(self) -> Optional[str]:
        return next(iter(self._hashes.values()), None)

    @property
    def hash_name(self) -> Optional[str]:
        return next(iter(self._hashes), None)

    @property
    def show_url(self) -> str:
        return posixpath.basename(self._url.split("#", 1)[0].split("?", 1)[0])

    @property
    def is_file(self) -> bool:
        return self.scheme == "file"

    def is_existing_dir(self) -> bool:
        return self.is_file and os.path.isdir(self.file_path)

    @property
    def is_wheel(self) -> bool:
        return self.ext == WHEEL_EXTENSION

    @property
    def is_vcs(self) -> bool:
        from pip._internal.vcs import vcs

        return self.scheme in vcs.all_schemes

    @property
    def is_yanked(self) -> bool:
        return self.yanked_reason is not None

    @property
    def has_hash(self) -> bool:
        return bool(self._hashes)

    def is_hash_allowed(self, hashes: Optional[Hashes]) -> bool:
        """
        Return True if the link has a hash and it is allowed by `hashes`.
        """
        if hashes is None:
            return False
        return any(hashes.is_hash_allowed(k, v) for k, v in self._hashes.items())


class _CleanResult(NamedTuple):
    """Convert link for equivalency check.

    This is used in the resolver to check whether two URL-specified requirements
    likely point to the same distribution and can be considered equivalent. This
    equivalency logic avoids comparing URLs literally, which can be too strict
    (e.g. "a=1&b=2" vs "b=2&a=1") and produce conflicts unexpected to users.

    Currently this does three things:

    1. Drop the basic auth part. This is technically wrong since a server can
       serve different content based on auth, but if it does that, it is even
       impossible to guarantee two URLs without auth are equivalent, since
       the user can input different auth information when prompted. So the
       practical solution is to assume the auth doesn't affect the response.
    2. Parse the query to avoid the ordering issue. Note that ordering under the
       same key in the query is NOT cleaned; i.e. "a=1&a=2" and "a=2&a=1" are
       still considered different.
    3. Explicitly drop most of the fragment part, except ``subdirectory=`` and
       hash values, since it should have no impact on the downloaded content.
       Note that this drops the "egg=" part historically used to denote the
       requested project (and extras), which is wrong in the strictest sense, but
       too many people are supplying it inconsistently to cause superfluous
       resolution conflicts, so we choose to also ignore them.
    """

    parsed: urllib.parse.SplitResult
    query: Dict[str, List[str]]
    subdirectory: str
    hashes: Dict[str, str]


def _clean_link(link: Link) -> _CleanResult:
    parsed = link._parsed_url
    netloc = parsed.netloc.rsplit("@", 1)[-1]
    # According to RFC 8089, an empty host in a file: URL means localhost.
    if parsed.scheme == "file" and not netloc:
        netloc = "localhost"
    fragment = urllib.parse.parse_qs(parsed.fragment)
    if "egg" in fragment:
        logger.debug("Ignoring egg= fragment in %s", link)
    try:
        # If there are multiple subdirectory values, use the first one.
        subdirectory = fragment["subdirectory"][0]
    except (IndexError, KeyError):
        subdirectory = ""
    # If there are multiple hash values under the same algorithm, use the
    # first one.
    hashes = {k: fragment[k][0] for k in _SUPPORTED_HASHES if k in fragment}
    return _CleanResult(
        parsed=parsed._replace(netloc=netloc, query="", fragment=""),
        query=urllib.parse.parse_qs(parsed.query),
        subdirectory=subdirectory,
        hashes=hashes,
    )


@functools.lru_cache(maxsize=None)
def links_equivalent(link1: Link, link2: Link) -> bool:
    return _clean_link(link1) == _clean_link(link2)
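# Illustrative sketch (not part of pip's source): expected behaviour of the
# helpers above, under the assumption of the made-up example URLs shown.
#
#     a = Link("https://example.com/pkg-1.0.tar.gz?a=1&b=2")
#     b = Link("https://example.com/pkg-1.0.tar.gz?b=2&a=1")
#     links_equivalent(a, b)  # True: the query is compared as a parsed mapping,
#                             # so key ordering does not matter.
#
#     LinkHash.find_hash_url_fragment(
#         "https://example.com/pkg-1.0.tar.gz#sha256=deadbeef"
#     )
#     # -> LinkHash(name="sha256", value="deadbeef"); the digest string is kept
#     #    as-is and only validated later against Hashes.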