Dataset columns: index (int64, 0 – 731k) · package (string, 2 – 98 chars) · name (string, 1 – 76 chars) · docstring (string, 0 – 281k chars) · code (string, 4 – 1.07M chars) · signature (string, 2 – 42.8k chars)
11,621
rarfile
read
Return uncompressed data for archive entry. For longer files using :meth:`~RarFile.open` may be better idea. Parameters: name filename or RarInfo instance pwd password to use for extracting.
def read(self, name, pwd=None):
    """Return uncompressed data for archive entry.

    For longer files using :meth:`~RarFile.open` may be better idea.

    Parameters:

        name
            filename or RarInfo instance
        pwd
            password to use for extracting.
    """
    with self.open(name, "r", pwd) as f:
        return f.read()
(self, name, pwd=None)
11,622
rarfile
setpassword
Sets the password to use when extracting.
def setpassword(self, pwd):
    """Sets the password to use when extracting.
    """
    self._password = pwd
    if self._file_parser:
        if self._file_parser.has_header_encryption():
            self._file_parser = None
    if not self._file_parser:
        self._parse()
    else:
        self._file_parser.setpassword(self._password)
(self, pwd)
11,623
rarfile
strerror
Return error string if parsing failed or None if no problems.
def strerror(self):
    """Return error string if parsing failed or None if no problems.
    """
    if not self._file_parser:
        return "Not a RAR file"
    return self._file_parser.strerror()
(self)
11,624
rarfile
testrar
Read all files and test CRC.
def testrar(self, pwd=None):
    """Read all files and test CRC.
    """
    for member in self.infolist():
        if member.is_file():
            with self.open(member, 'r', pwd) as f:
                empty_read(f, member.file_size, BSIZE)
(self, pwd=None)
11,625
rarfile
volumelist
Returns filenames of archive volumes. In case of single-volume archive, the list contains just the name of main archive file.
def volumelist(self):
    """Returns filenames of archive volumes.

    In case of single-volume archive, the list contains
    just the name of main archive file.
    """
    return self._file_parser.volumelist()
(self)
11,626
rarfile
RarInfo
An entry in rar archive. Timestamps as :class:`~datetime.datetime` are without timezone in RAR3, with UTC timezone in RAR5 archives. Attributes: filename File name with relative path. Path separator is "/". Always unicode string. date_time File modification timestamp. As tuple of (year, month, day, hour, minute, second). RAR5 allows archives where it is missing, it's None then. comment Optional file comment field. Unicode string. (RAR3-only) file_size Uncompressed size. compress_size Compressed size. compress_type Compression method: one of :data:`RAR_M0` .. :data:`RAR_M5` constants. extract_version Minimal Rar version needed for decompressing. As (major*10 + minor), so 2.9 is 29. RAR3: 10, 20, 29 RAR5 does not have such field in archive, it's simply set to 50. host_os Host OS type, one of RAR_OS_* constants. RAR3: :data:`RAR_OS_WIN32`, :data:`RAR_OS_UNIX`, :data:`RAR_OS_MSDOS`, :data:`RAR_OS_OS2`, :data:`RAR_OS_BEOS`. RAR5: :data:`RAR_OS_WIN32`, :data:`RAR_OS_UNIX`. mode File attributes. May be either dos-style or unix-style, depending on host_os. mtime File modification time. Same value as :attr:`date_time` but as :class:`~datetime.datetime` object with extended precision. ctime Optional time field: creation time. As :class:`~datetime.datetime` object. atime Optional time field: last access time. As :class:`~datetime.datetime` object. arctime Optional time field: archival time. As :class:`~datetime.datetime` object. (RAR3-only) CRC CRC-32 of uncompressed file, unsigned int. RAR5: may be None. blake2sp_hash Blake2SP hash over decompressed data. (RAR5-only) volume Volume nr, starting from 0. volume_file Volume file name, where file starts. file_redir If not None, file is link of some sort. Contains tuple of (type, flags, target). (RAR5-only) Type is one of constants: :data:`RAR5_XREDIR_UNIX_SYMLINK` Unix symlink. :data:`RAR5_XREDIR_WINDOWS_SYMLINK` Windows symlink. :data:`RAR5_XREDIR_WINDOWS_JUNCTION` Windows junction. 
:data:`RAR5_XREDIR_HARD_LINK` Hard link to target. :data:`RAR5_XREDIR_FILE_COPY` Current file is copy of another archive entry. Flags may contain bits: :data:`RAR5_XREDIR_ISDIR` Symlink points to directory.
class RarInfo:
    r"""An entry in rar archive.

    Timestamps as :class:`~datetime.datetime` are without timezone in RAR3,
    with UTC timezone in RAR5 archives.

    Attributes:

        filename
            File name with relative path.
            Path separator is "/".  Always unicode string.

        date_time
            File modification timestamp.  As tuple of
            (year, month, day, hour, minute, second).
            RAR5 allows archives where it is missing, it's None then.

        comment
            Optional file comment field.  Unicode string.  (RAR3-only)

        file_size
            Uncompressed size.

        compress_size
            Compressed size.

        compress_type
            Compression method: one of :data:`RAR_M0` .. :data:`RAR_M5` constants.

        extract_version
            Minimal Rar version needed for decompressing.  As (major*10 + minor),
            so 2.9 is 29.

            RAR3: 10, 20, 29

            RAR5 does not have such field in archive, it's simply set to 50.

        host_os
            Host OS type, one of RAR_OS_* constants.

            RAR3: :data:`RAR_OS_WIN32`, :data:`RAR_OS_UNIX`, :data:`RAR_OS_MSDOS`,
            :data:`RAR_OS_OS2`, :data:`RAR_OS_BEOS`.

            RAR5: :data:`RAR_OS_WIN32`, :data:`RAR_OS_UNIX`.

        mode
            File attributes.  May be either dos-style or unix-style,
            depending on host_os.

        mtime
            File modification time.  Same value as :attr:`date_time`
            but as :class:`~datetime.datetime` object with extended precision.

        ctime
            Optional time field: creation time.  As :class:`~datetime.datetime` object.

        atime
            Optional time field: last access time.  As :class:`~datetime.datetime` object.

        arctime
            Optional time field: archival time.  As :class:`~datetime.datetime` object.
            (RAR3-only)

        CRC
            CRC-32 of uncompressed file, unsigned int.

            RAR5: may be None.

        blake2sp_hash
            Blake2SP hash over decompressed data.  (RAR5-only)

        volume
            Volume nr, starting from 0.

        volume_file
            Volume file name, where file starts.

        file_redir
            If not None, file is link of some sort.  Contains tuple of
            (type, flags, target).  (RAR5-only)

            Type is one of constants:

                :data:`RAR5_XREDIR_UNIX_SYMLINK`
                    Unix symlink.
                :data:`RAR5_XREDIR_WINDOWS_SYMLINK`
                    Windows symlink.
                :data:`RAR5_XREDIR_WINDOWS_JUNCTION`
                    Windows junction.
                :data:`RAR5_XREDIR_HARD_LINK`
                    Hard link to target.
                :data:`RAR5_XREDIR_FILE_COPY`
                    Current file is copy of another archive entry.

            Flags may contain bits:

                :data:`RAR5_XREDIR_ISDIR`
                    Symlink points to directory.
    """

    # zipfile-compatible fields
    filename = None
    file_size = None
    compress_size = None
    date_time = None
    CRC = None
    volume = None
    orig_filename = None

    # optional extended time fields, datetime() objects.
    mtime = None
    ctime = None
    atime = None

    extract_version = None
    mode = None
    host_os = None
    compress_type = None

    # rar3-only fields
    comment = None
    arctime = None

    # rar5-only fields
    blake2sp_hash = None
    file_redir = None

    # internal fields
    flags = 0
    type = None

    # zipfile compat
    def is_dir(self):
        """Returns True if entry is a directory.

        .. versionadded:: 4.0
        """
        return False

    def is_symlink(self):
        """Returns True if entry is a symlink.

        .. versionadded:: 4.0
        """
        return False

    def is_file(self):
        """Returns True if entry is a normal file.

        .. versionadded:: 4.0
        """
        return False

    def needs_password(self):
        """Returns True if data is stored password-protected.
        """
        if self.type == RAR_BLOCK_FILE:
            return (self.flags & RAR_FILE_PASSWORD) > 0
        return False

    def isdir(self):
        """Returns True if entry is a directory.

        .. deprecated:: 4.0
        """
        return self.is_dir()
()
11,632
rarfile
RarLockedArchiveError
Must not modify locked archive
class RarLockedArchiveError(RarExecError): """Must not modify locked archive"""
null
11,633
rarfile
RarMemoryError
Memory error
class RarMemoryError(RarExecError): """Memory error"""
null
11,634
rarfile
RarNoFilesError
No files that match pattern were found
class RarNoFilesError(RarExecError): """No files that match pattern were found"""
null
11,635
rarfile
RarOpenError
Open error
class RarOpenError(RarExecError): """Open error"""
null
11,636
rarfile
RarSignalExit
Unrar exited with signal
class RarSignalExit(RarExecError): """Unrar exited with signal"""
null
11,637
rarfile
RarUnknownError
Unknown exit code
class RarUnknownError(RarExecError): """Unknown exit code"""
null
11,638
rarfile
RarUserBreak
User stop
class RarUserBreak(RarExecError): """User stop"""
null
11,639
rarfile
RarUserError
User error
class RarUserError(RarExecError): """User error"""
null
11,640
rarfile
RarWarning
Non-fatal error
class RarWarning(RarExecError): """Non-fatal error"""
null
11,641
rarfile
RarWriteError
Write error
class RarWriteError(RarExecError): """Write error"""
null
11,642
rarfile
RarWrongPassword
Incorrect password
class RarWrongPassword(RarExecError): """Incorrect password"""
null
11,643
_struct
Struct
Struct(fmt) --> compiled struct object
from _struct import Struct
null
11,644
rarfile
ToolSetup
null
class ToolSetup:

    def __init__(self, setup):
        self.setup = setup

    def check(self):
        cmdline = self.get_cmdline("check_cmd", None)
        try:
            p = custom_popen(cmdline)
            out, _ = p.communicate()
            return p.returncode == 0
        except RarCannotExec:
            return False

    def open_cmdline(self, pwd, rarfn, filefn=None):
        cmdline = self.get_cmdline("open_cmd", pwd)
        cmdline.append(rarfn)
        if filefn:
            self.add_file_arg(cmdline, filefn)
        return cmdline

    def get_errmap(self):
        return self.setup["errmap"]

    def get_cmdline(self, key, pwd, nodash=False):
        cmdline = list(self.setup[key])
        cmdline[0] = globals()[cmdline[0]]
        if key == "check_cmd":
            return cmdline
        self.add_password_arg(cmdline, pwd)
        if not nodash:
            cmdline.append("--")
        return cmdline

    def add_file_arg(self, cmdline, filename):
        cmdline.append(filename)

    def add_password_arg(self, cmdline, pwd):
        """Append password switch to commandline.
        """
        if pwd is not None:
            if not isinstance(pwd, str):
                pwd = pwd.decode("utf8")
            args = self.setup["password"]
            if args is None:
                tool = self.setup["open_cmd"][0]
                raise RarCannotExec(f"{tool} does not support passwords")
            elif isinstance(args, str):
                cmdline.append(args + pwd)
            else:
                cmdline.extend(args)
                cmdline.append(pwd)
        else:
            cmdline.extend(self.setup["no_password"])
(setup)
11,645
rarfile
__init__
null
def __init__(self, setup):
    self.setup = setup
(self, setup)
11,646
rarfile
add_file_arg
null
def add_file_arg(self, cmdline, filename):
    cmdline.append(filename)
(self, cmdline, filename)
11,647
rarfile
add_password_arg
Append password switch to commandline.
def add_password_arg(self, cmdline, pwd):
    """Append password switch to commandline.
    """
    if pwd is not None:
        if not isinstance(pwd, str):
            pwd = pwd.decode("utf8")
        args = self.setup["password"]
        if args is None:
            tool = self.setup["open_cmd"][0]
            raise RarCannotExec(f"{tool} does not support passwords")
        elif isinstance(args, str):
            cmdline.append(args + pwd)
        else:
            cmdline.extend(args)
            cmdline.append(pwd)
    else:
        cmdline.extend(self.setup["no_password"])
(self, cmdline, pwd)
11,648
rarfile
check
null
def check(self):
    cmdline = self.get_cmdline("check_cmd", None)
    try:
        p = custom_popen(cmdline)
        out, _ = p.communicate()
        return p.returncode == 0
    except RarCannotExec:
        return False
(self)
11,649
rarfile
get_cmdline
null
def get_cmdline(self, key, pwd, nodash=False):
    cmdline = list(self.setup[key])
    cmdline[0] = globals()[cmdline[0]]
    if key == "check_cmd":
        return cmdline
    self.add_password_arg(cmdline, pwd)
    if not nodash:
        cmdline.append("--")
    return cmdline
(self, key, pwd, nodash=False)
11,650
rarfile
get_errmap
null
def get_errmap(self):
    return self.setup["errmap"]
(self)
11,651
rarfile
open_cmdline
null
def open_cmdline(self, pwd, rarfn, filefn=None):
    cmdline = self.get_cmdline("open_cmd", pwd)
    cmdline.append(rarfn)
    if filefn:
        self.add_file_arg(cmdline, filefn)
    return cmdline
(self, pwd, rarfn, filefn=None)
11,652
rarfile
UnicodeFilename
Handle RAR3 unicode filename decompression.
class UnicodeFilename:
    """Handle RAR3 unicode filename decompression.
    """
    def __init__(self, name, encdata):
        self.std_name = bytearray(name)
        self.encdata = bytearray(encdata)
        self.pos = self.encpos = 0
        self.buf = bytearray()
        self.failed = 0

    def enc_byte(self):
        """Copy encoded byte."""
        try:
            c = self.encdata[self.encpos]
            self.encpos += 1
            return c
        except IndexError:
            self.failed = 1
            return 0

    def std_byte(self):
        """Copy byte from 8-bit representation."""
        try:
            return self.std_name[self.pos]
        except IndexError:
            self.failed = 1
            return ord("?")

    def put(self, lo, hi):
        """Copy 16-bit value to result."""
        self.buf.append(lo)
        self.buf.append(hi)
        self.pos += 1

    def decode(self):
        """Decompress compressed UTF16 value."""
        hi = self.enc_byte()
        flagbits = 0
        while self.encpos < len(self.encdata):
            if flagbits == 0:
                flags = self.enc_byte()
                flagbits = 8
            flagbits -= 2
            t = (flags >> flagbits) & 3
            if t == 0:
                self.put(self.enc_byte(), 0)
            elif t == 1:
                self.put(self.enc_byte(), hi)
            elif t == 2:
                self.put(self.enc_byte(), self.enc_byte())
            else:
                n = self.enc_byte()
                if n & 0x80:
                    c = self.enc_byte()
                    for _ in range((n & 0x7f) + 2):
                        lo = (self.std_byte() + c) & 0xFF
                        self.put(lo, hi)
                else:
                    for _ in range(n + 2):
                        self.put(self.std_byte(), 0)
        return self.buf.decode("utf-16le", "replace")
(name, encdata)
11,653
rarfile
__init__
null
def __init__(self, name, encdata):
    self.std_name = bytearray(name)
    self.encdata = bytearray(encdata)
    self.pos = self.encpos = 0
    self.buf = bytearray()
    self.failed = 0
(self, name, encdata)
11,654
rarfile
decode
Decompress compressed UTF16 value.
def decode(self):
    """Decompress compressed UTF16 value."""
    hi = self.enc_byte()
    flagbits = 0
    while self.encpos < len(self.encdata):
        if flagbits == 0:
            flags = self.enc_byte()
            flagbits = 8
        flagbits -= 2
        t = (flags >> flagbits) & 3
        if t == 0:
            self.put(self.enc_byte(), 0)
        elif t == 1:
            self.put(self.enc_byte(), hi)
        elif t == 2:
            self.put(self.enc_byte(), self.enc_byte())
        else:
            n = self.enc_byte()
            if n & 0x80:
                c = self.enc_byte()
                for _ in range((n & 0x7f) + 2):
                    lo = (self.std_byte() + c) & 0xFF
                    self.put(lo, hi)
            else:
                for _ in range(n + 2):
                    self.put(self.std_byte(), 0)
    return self.buf.decode("utf-16le", "replace")
(self)
11,655
rarfile
enc_byte
Copy encoded byte.
def enc_byte(self):
    """Copy encoded byte."""
    try:
        c = self.encdata[self.encpos]
        self.encpos += 1
        return c
    except IndexError:
        self.failed = 1
        return 0
(self)
11,656
rarfile
put
Copy 16-bit value to result.
def put(self, lo, hi):
    """Copy 16-bit value to result."""
    self.buf.append(lo)
    self.buf.append(hi)
    self.pos += 1
(self, lo, hi)
11,657
rarfile
std_byte
Copy byte from 8-bit representation.
def std_byte(self):
    """Copy byte from 8-bit representation."""
    try:
        return self.std_name[self.pos]
    except IndexError:
        self.failed = 1
        return ord("?")
(self)
11,658
rarfile
UnsupportedWarning
Archive uses feature that are unsupported by rarfile. .. versionadded:: 4.0
class UnsupportedWarning(UserWarning): """Archive uses feature that are unsupported by rarfile. .. versionadded:: 4.0 """
null
11,659
rarfile
XFile
Input may be filename or file object.
class XFile:
    """Input may be filename or file object.
    """
    __slots__ = ("_fd", "_need_close")

    def __init__(self, xfile, bufsize=1024):
        if is_filelike(xfile):
            self._need_close = False
            self._fd = xfile
            self._fd.seek(0)
        else:
            self._need_close = True
            self._fd = open(xfile, "rb", bufsize)

    def read(self, n=None):
        """Read from file."""
        return self._fd.read(n)

    def tell(self):
        """Return file pos."""
        return self._fd.tell()

    def seek(self, ofs, whence=0):
        """Move file pos."""
        return self._fd.seek(ofs, whence)

    def readinto(self, buf):
        """Read into buffer."""
        return self._fd.readinto(buf)

    def close(self):
        """Close file object."""
        if self._need_close:
            self._fd.close()

    def __enter__(self):
        return self

    def __exit__(self, typ, val, tb):
        self.close()
(xfile, bufsize=1024)
11,661
rarfile
__exit__
null
def __exit__(self, typ, val, tb):
    self.close()
(self, typ, val, tb)
11,662
rarfile
__init__
null
def __init__(self, xfile, bufsize=1024):
    if is_filelike(xfile):
        self._need_close = False
        self._fd = xfile
        self._fd.seek(0)
    else:
        self._need_close = True
        self._fd = open(xfile, "rb", bufsize)
(self, xfile, bufsize=1024)
11,663
rarfile
close
Close file object.
def close(self):
    """Close file object."""
    if self._need_close:
        self._fd.close()
(self)
11,664
rarfile
read
Read from file.
def read(self, n=None):
    """Read from file."""
    return self._fd.read(n)
(self, n=None)
11,665
rarfile
readinto
Read into buffer.
def readinto(self, buf):
    """Read into buffer."""
    return self._fd.readinto(buf)
(self, buf)
11,666
rarfile
seek
Move file pos.
def seek(self, ofs, whence=0):
    """Move file pos."""
    return self._fd.seek(ofs, whence)
(self, ofs, whence=0)
11,667
rarfile
tell
Return file pos.
def tell(self):
    """Return file pos."""
    return self._fd.tell()
(self)
11,668
rarfile
_find_sfx_header
null
def _find_sfx_header(xfile):
    sig = RAR_ID[:-1]
    buf = io.BytesIO()
    steps = (64, SFX_MAX_SIZE)

    with XFile(xfile) as fd:
        for step in steps:
            data = fd.read(step)
            if not data:
                break
            buf.write(data)
            curdata = buf.getvalue()

            findpos = 0
            while True:
                pos = curdata.find(sig, findpos)
                if pos < 0:
                    break
                if curdata[pos:pos + len(RAR_ID)] == RAR_ID:
                    return RAR_V3, pos
                if curdata[pos:pos + len(RAR5_ID)] == RAR5_ID:
                    return RAR_V5, pos
                findpos = pos + len(sig)
    return 0, 0
(xfile)
11,669
rarfile
_inc_volname
increase digits with carry, otherwise just increment char
def _inc_volname(volfile, i, inc_chars):
    """increase digits with carry, otherwise just increment char
    """
    fn = list(volfile)
    while i >= 0:
        if fn[i] == "9":
            fn[i] = "0"
            i -= 1
            if i < 0:
                fn.insert(0, "1")
        elif "0" <= fn[i] < "9" or inc_chars:
            fn[i] = chr(ord(fn[i]) + 1)
            break
        else:
            fn.insert(i + 1, "1")
            break
    return "".join(fn)
(volfile, i, inc_chars)
11,670
rarfile
_next_newvol
New-style next volume
def _next_newvol(volfile):
    """New-style next volume
    """
    name, ext = os.path.splitext(volfile)
    if ext.lower() in ("", ".exe", ".sfx"):
        volfile = name + ".rar"
    i = len(volfile) - 1
    while i >= 0:
        if "0" <= volfile[i] <= "9":
            return _inc_volname(volfile, i, False)
        if volfile[i] in ("/", os.sep):
            break
        i -= 1
    raise BadRarName("Cannot construct volume name: " + volfile)
(volfile)
11,671
rarfile
_next_oldvol
Old-style next volume
def _next_oldvol(volfile):
    """Old-style next volume
    """
    name, ext = os.path.splitext(volfile)
    if ext.lower() in ("", ".exe", ".sfx"):
        ext = ".rar"
    sfx = ext[2:]
    if _rc_num.match(sfx):
        ext = _inc_volname(ext, len(ext) - 1, True)
    else:
        # .rar -> .r00
        ext = ext[:2] + "00"
    return name + ext
(volfile)
11,672
rarfile
_parse_ext_time
Parse all RAR3 extended time fields
def _parse_ext_time(h, data, pos):
    """Parse all RAR3 extended time fields
    """
    # flags and rest of data can be missing
    flags = 0
    if pos + 2 <= len(data):
        flags = S_SHORT.unpack_from(data, pos)[0]
        pos += 2

    mtime, pos = _parse_xtime(flags >> 3 * 4, data, pos, h.mtime)
    h.ctime, pos = _parse_xtime(flags >> 2 * 4, data, pos)
    h.atime, pos = _parse_xtime(flags >> 1 * 4, data, pos)
    h.arctime, pos = _parse_xtime(flags >> 0 * 4, data, pos)
    if mtime:
        h.mtime = mtime
        h.date_time = mtime.timetuple()[:6]
    return pos
(h, data, pos)
11,673
rarfile
_parse_xtime
Parse one RAR3 extended time field
def _parse_xtime(flag, data, pos, basetime=None):
    """Parse one RAR3 extended time field
    """
    res = None
    if flag & 8:
        if not basetime:
            basetime, pos = load_dostime(data, pos)

        # load second fractions of 100ns units
        rem = 0
        cnt = flag & 3
        for _ in range(cnt):
            b, pos = load_byte(data, pos)
            rem = (b << 16) | (rem >> 8)

        # dostime has room for 30 seconds only, correct if needed
        if flag & 4 and basetime.second < 59:
            basetime = basetime.replace(second=basetime.second + 1)

        res = to_nsdatetime(basetime, rem * 100)
    return res, pos
(flag, data, pos, basetime=None)
11,674
_blake2
blake2s
Return a new BLAKE2s hash object.
from _blake2 import blake2s
(data=b'', /, *, digest_size=32, key=b'', salt=b'', person=b'', fanout=1, depth=1, leaf_size=0, node_offset=0, node_depth=0, inner_size=0, last_node=False, usedforsecurity=True)
11,675
rarfile
check_returncode
Raise exception according to unrar exit code.
def check_returncode(code, out, errmap):
    """Raise exception according to unrar exit code.
    """
    if code == 0:
        return

    if code > 0 and code < len(errmap):
        exc = errmap[code]
    elif code == 255:
        exc = RarUserBreak
    elif code < 0:
        exc = RarSignalExit
    else:
        exc = RarUnknownError

    # format message
    if out:
        msg = "%s [%d]: %s" % (exc.__doc__, code, out)
    else:
        msg = "%s [%d]" % (exc.__doc__, code)

    raise exc(msg)
(code, out, errmap)
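A self-contained sketch of the exit-code mapping, assuming simplified stand-ins for the module's `RarExecError` hierarchy; the `errmap` contents here are hypothetical (in rarfile the table comes from the tool setup).

```python
class RarExecError(Exception):
    pass

class RarWarning(RarExecError):
    """Non-fatal error"""

class RarFatalError(RarExecError):
    """Fatal error"""

class RarUserBreak(RarExecError):
    """User stop"""

class RarSignalExit(RarExecError):
    """Unrar exited with signal"""

class RarUnknownError(RarExecError):
    """Unknown exit code"""

def check_returncode(code, out, errmap):
    """Raise exception according to unrar exit code (sketch)."""
    if code == 0:
        return
    if 0 < code < len(errmap):
        exc = errmap[code]          # table lookup by exit code
    elif code == 255:
        exc = RarUserBreak
    elif code < 0:
        exc = RarSignalExit         # negative = killed by signal
    else:
        exc = RarUnknownError
    if out:
        msg = "%s [%d]: %s" % (exc.__doc__, code, out)
    else:
        msg = "%s [%d]" % (exc.__doc__, code)
    raise exc(msg)

# Hypothetical errmap: index = tool exit code.
errmap = [None, RarWarning, RarFatalError]

try:
    check_returncode(1, "details", errmap)
except RarWarning as e:
    print(e)   # Non-fatal error [1]: details
```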
11,676
rarfile
custom_popen
Disconnect cmd from parent fds, read only from stdout.
def custom_popen(cmd):
    """Disconnect cmd from parent fds, read only from stdout.
    """
    creationflags = 0x08000000 if WIN32 else 0  # CREATE_NO_WINDOW
    try:
        p = Popen(cmd, bufsize=0, stdout=PIPE, stderr=STDOUT, stdin=DEVNULL,
                  creationflags=creationflags)
    except OSError as ex:
        if ex.errno == errno.ENOENT:
            raise RarCannotExec("Unrar not installed?") from None
        if ex.errno == errno.EACCES or ex.errno == errno.EPERM:
            raise RarCannotExec("Cannot execute unrar") from None
        raise
    return p
(cmd)
11,678
rarfile
empty_read
Read and drop fixed amount of data.
def empty_read(src, size, blklen):
    """Read and drop fixed amount of data.
    """
    while size > 0:
        if size > blklen:
            res = src.read(blklen)
        else:
            res = src.read(size)
        if not res:
            raise BadRarFile("cannot load data")
        size -= len(res)
(src, size, blklen)
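A runnable sketch of the drain loop, with a local `BadRarFile` stand-in so it works without the module; `io.BytesIO` plays the role of the archive stream.

```python
import io

class BadRarFile(Exception):
    pass

def empty_read(src, size, blklen):
    """Read and drop exactly `size` bytes in `blklen` chunks."""
    while size > 0:
        res = src.read(min(size, blklen))
        if not res:
            # stream ended before `size` bytes arrived
            raise BadRarFile("cannot load data")
        size -= len(res)

# Draining 10 bytes in 4-byte blocks succeeds (4 + 4 + 2).
empty_read(io.BytesIO(b"x" * 10), 10, 4)

# A too-short source raises BadRarFile.
try:
    empty_read(io.BytesIO(b"x" * 5), 10, 4)
except BadRarFile as e:
    print(e)
```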
11,680
rarfile
get_rar_version
Check quickly whether file is rar archive.
def get_rar_version(xfile):
    """Check quickly whether file is rar archive.
    """
    with XFile(xfile) as fd:
        buf = fd.read(len(RAR5_ID))
    if buf.startswith(RAR_ID):
        return RAR_V3
    elif buf.startswith(RAR5_ID):
        return RAR_V5
    return 0
(xfile)
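The version check amounts to sniffing magic bytes. A minimal sketch, using the well-known RAR3/RAR5 signatures (in rarfile these are the `RAR_ID` and `RAR5_ID` constants) against an in-memory stream:

```python
import io

# Well-known RAR magic signatures.
RAR_ID = b"Rar!\x1a\x07\x00"        # RAR 1.5 - 4.x
RAR5_ID = b"Rar!\x1a\x07\x01\x00"   # RAR 5.0+
RAR_V3, RAR_V5 = 3, 5

def sniff_rar_version(fd):
    """Simplified get_rar_version that takes an open file object."""
    buf = fd.read(len(RAR5_ID))
    if buf.startswith(RAR_ID):
        return RAR_V3
    if buf.startswith(RAR5_ID):
        return RAR_V5
    return 0

print(sniff_rar_version(io.BytesIO(RAR5_ID + b"rest")))  # 5
print(sniff_rar_version(io.BytesIO(RAR_ID + b"rest")))   # 3
print(sniff_rar_version(io.BytesIO(b"PK\x03\x04")))      # 0 (zip, not rar)
```

The two signatures differ at byte 6 (`\x00` vs `\x01`), so the `startswith` checks cannot both match.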
11,682
rarfile
is_filelike
Filename or file object?
def is_filelike(obj):
    """Filename or file object?
    """
    if isinstance(obj, (bytes, str, Path)):
        return False
    res = True
    for a in ("read", "tell", "seek"):
        res = res and hasattr(obj, a)
    if not res:
        raise ValueError("Invalid object passed as file")
    return True
(obj)
11,683
rarfile
is_rarfile
Check quickly whether file is rar archive.
def is_rarfile(xfile):
    """Check quickly whether file is rar archive.
    """
    try:
        return get_rar_version(xfile) > 0
    except OSError:
        # File not found or not accessible, ignore
        return False
(xfile)
11,684
rarfile
is_rarfile_sfx
Check whether file is rar archive with support for SFX. It will read 2M from file.
def is_rarfile_sfx(xfile):
    """Check whether file is rar archive with support for SFX.

    It will read 2M from file.
    """
    return _find_sfx_header(xfile)[0] > 0
(xfile)
11,685
rarfile
load_byte
Load single byte
def load_byte(buf, pos):
    """Load single byte"""
    end = pos + 1
    if end > len(buf):
        raise BadRarFile("cannot load byte")
    return S_BYTE.unpack_from(buf, pos)[0], end
(buf, pos)
11,686
rarfile
load_bytes
Load sequence of bytes
def load_bytes(buf, num, pos):
    """Load sequence of bytes"""
    end = pos + num
    if end > len(buf):
        raise BadRarFile("cannot load bytes")
    return buf[pos: end], end
(buf, num, pos)
11,687
rarfile
load_dostime
Load LE32 dos timestamp
def load_dostime(buf, pos):
    """Load LE32 dos timestamp"""
    stamp, pos = load_le32(buf, pos)
    tup = parse_dos_time(stamp)
    return to_datetime(tup), pos
(buf, pos)
11,688
rarfile
load_le32
Load little-endian 32-bit integer
def load_le32(buf, pos):
    """Load little-endian 32-bit integer"""
    end = pos + 4
    if end > len(buf):
        raise BadRarFile("cannot load le32")
    return S_LONG.unpack_from(buf, pos)[0], end
(buf, pos)
11,689
rarfile
load_unixtime
Load LE32 unix timestamp
def load_unixtime(buf, pos):
    """Load LE32 unix timestamp"""
    secs, pos = load_le32(buf, pos)
    dt = datetime.fromtimestamp(secs, timezone.utc)
    return dt, pos
(buf, pos)
11,690
rarfile
load_vint
Load RAR5 variable-size int.
def load_vint(buf, pos):
    """Load RAR5 variable-size int."""
    limit = min(pos + 11, len(buf))
    res = ofs = 0
    while pos < limit:
        b = buf[pos]
        res += ((b & 0x7F) << ofs)
        pos += 1
        ofs += 7
        if b < 0x80:
            return res, pos
    raise BadRarFile("cannot load vint")
(buf, pos)
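A self-contained sketch of RAR5 vint decoding: 7 data bits per byte, little-endian group order, high bit set meaning "more bytes follow". The function body is copied from above with a local `BadRarFile` stand-in.

```python
class BadRarFile(Exception):
    pass

def load_vint(buf, pos):
    """Load RAR5 variable-size int, return (value, next_pos)."""
    limit = min(pos + 11, len(buf))   # vint may span at most 11 bytes
    res = ofs = 0
    while pos < limit:
        b = buf[pos]
        res += (b & 0x7F) << ofs      # low 7 bits are data
        pos += 1
        ofs += 7
        if b < 0x80:                  # high bit clear: last byte
            return res, pos
    raise BadRarFile("cannot load vint")

print(load_vint(b"\x7f", 0))       # (127, 1) - single byte
print(load_vint(b"\x96\x01", 0))   # (150, 2) - 0x16 + (0x01 << 7)
```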
11,691
rarfile
load_vstr
Load bytes prefixed by vint length
def load_vstr(buf, pos):
    """Load bytes prefixed by vint length"""
    slen, pos = load_vint(buf, pos)
    return load_bytes(buf, slen, pos)
(buf, pos)
11,692
rarfile
load_windowstime
Load LE64 windows timestamp
def load_windowstime(buf, pos):
    """Load LE64 windows timestamp"""
    # unix epoch (1970) in seconds from windows epoch (1601)
    unix_epoch = 11644473600
    val1, pos = load_le32(buf, pos)
    val2, pos = load_le32(buf, pos)
    secs, n1secs = divmod((val2 << 32) | val1, 10000000)
    dt = datetime.fromtimestamp(secs - unix_epoch, timezone.utc)
    dt = to_nsdatetime(dt, n1secs * 100)
    return dt, pos
(buf, pos)
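A standalone sketch of the FILETIME decoding: the value counts 100 ns units since 1601-01-01, stored as two little-endian 32-bit halves, and 11644473600 s is the 1601-to-1970 epoch offset. Unlike the real code, this sketch returns the sub-second remainder separately instead of wrapping it into `nsdatetime`.

```python
from datetime import datetime, timezone
import struct

def load_windowstime(buf, pos):
    """Decode LE64 Windows FILETIME, return (datetime, extra_ns, next_pos)."""
    unix_epoch = 11644473600  # seconds from 1601-01-01 to 1970-01-01
    val1, val2 = struct.unpack_from("<LL", buf, pos)
    secs, n1secs = divmod((val2 << 32) | val1, 10000000)
    dt = datetime.fromtimestamp(secs - unix_epoch, timezone.utc)
    return dt, n1secs * 100, pos + 8

# FILETIME value corresponding to the unix epoch itself.
buf = struct.pack("<Q", 11644473600 * 10_000_000)
dt, ns, _ = load_windowstime(buf, 0)
print(dt)   # 1970-01-01 00:00:00+00:00
```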
11,693
rarfile
main
Minimal command-line interface for rarfile module.
def main(args):
    """Minimal command-line interface for rarfile module.
    """
    import argparse
    p = argparse.ArgumentParser(description=main.__doc__)
    g = p.add_mutually_exclusive_group(required=True)
    g.add_argument("-l", "--list", metavar="<rarfile>",
                   help="Show archive listing")
    g.add_argument("-e", "--extract", nargs=2,
                   metavar=("<rarfile>", "<output_dir>"),
                   help="Extract archive into target dir")
    g.add_argument("-t", "--test", metavar="<rarfile>",
                   help="Test if an archive is valid")
    cmd = p.parse_args(args)

    if cmd.list:
        with RarFile(cmd.list) as rf:
            rf.printdir()
    elif cmd.test:
        with RarFile(cmd.test) as rf:
            rf.testrar()
    elif cmd.extract:
        with RarFile(cmd.extract[0]) as rf:
            rf.extractall(cmd.extract[1])
(args)
11,694
rarfile
membuf_tempfile
Write in-memory file object to real file.
def membuf_tempfile(memfile):
    """Write in-memory file object to real file.
    """
    memfile.seek(0, 0)

    tmpfd, tmpname = mkstemp(suffix=".rar", dir=HACK_TMP_DIR)
    tmpf = os.fdopen(tmpfd, "wb")

    try:
        shutil.copyfileobj(memfile, tmpf, BSIZE)
        tmpf.close()
    except BaseException:
        tmpf.close()
        os.unlink(tmpname)
        raise
    return tmpname
(memfile)
11,695
tempfile
mkstemp
User-callable function to create and return a unique temporary file. The return value is a pair (fd, name) where fd is the file descriptor returned by os.open, and name is the filename. If 'suffix' is not None, the file name will end with that suffix, otherwise there will be no suffix. If 'prefix' is not None, the file name will begin with that prefix, otherwise a default prefix is used. If 'dir' is not None, the file will be created in that directory, otherwise a default directory is used. If 'text' is specified and true, the file is opened in text mode. Else (the default) the file is opened in binary mode. If any of 'suffix', 'prefix' and 'dir' are not None, they must be the same type. If they are bytes, the returned name will be bytes; str otherwise. The file is readable and writable only by the creating user ID. If the operating system uses permission bits to indicate whether a file is executable, the file is executable by no one. The file descriptor is not inherited by children of this process. Caller is responsible for deleting the file when done with it.
def mkstemp(suffix=None, prefix=None, dir=None, text=False):
    """User-callable function to create and return a unique temporary
    file.  The return value is a pair (fd, name) where fd is the
    file descriptor returned by os.open, and name is the filename.

    If 'suffix' is not None, the file name will end with that suffix,
    otherwise there will be no suffix.

    If 'prefix' is not None, the file name will begin with that prefix,
    otherwise a default prefix is used.

    If 'dir' is not None, the file will be created in that directory,
    otherwise a default directory is used.

    If 'text' is specified and true, the file is opened in text
    mode.  Else (the default) the file is opened in binary mode.

    If any of 'suffix', 'prefix' and 'dir' are not None, they must be the
    same type.  If they are bytes, the returned name will be bytes; str
    otherwise.

    The file is readable and writable only by the creating user ID.
    If the operating system uses permission bits to indicate whether a
    file is executable, the file is executable by no one.  The file
    descriptor is not inherited by children of this process.

    Caller is responsible for deleting the file when done with it.
    """
    prefix, suffix, dir, output_type = _sanitize_params(prefix, suffix, dir)

    if text:
        flags = _text_openflags
    else:
        flags = _bin_openflags

    return _mkstemp_inner(dir, prefix, suffix, flags, output_type)
(suffix=None, prefix=None, dir=None, text=False)
11,696
rarfile
nsdatetime
Datetime that carries nanoseconds. Arithmetic operations will lose nanoseconds. .. versionadded:: 4.0
class nsdatetime(datetime):
    """Datetime that carries nanoseconds.

    Arithmetic operations will lose nanoseconds.

    .. versionadded:: 4.0
    """
    __slots__ = ("nanosecond",)
    nanosecond: int     #: Number of nanoseconds, 0 <= nanosecond <= 999999999

    def __new__(cls, year, month=None, day=None, hour=0, minute=0, second=0,
                microsecond=0, tzinfo=None, *, fold=0, nanosecond=0):
        usec, mod = divmod(nanosecond, 1000) if nanosecond else (microsecond, 0)
        if mod == 0:
            return datetime(year, month, day, hour, minute, second, usec, tzinfo, fold=fold)
        self = super().__new__(cls, year, month, day, hour, minute, second, usec,
                               tzinfo, fold=fold)
        self.nanosecond = nanosecond
        return self

    def isoformat(self, sep="T", timespec="auto"):
        """Formats with nanosecond precision by default.
        """
        if timespec == "auto":
            pre, post = super().isoformat(sep, "microseconds").split(".", 1)
            return f"{pre}.{self.nanosecond:09d}{post[6:]}"
        return super().isoformat(sep, timespec)

    def astimezone(self, tz=None):
        """Convert to new timezone.
        """
        tmp = super().astimezone(tz)
        return self.__class__(tmp.year, tmp.month, tmp.day, tmp.hour, tmp.minute, tmp.second,
                              nanosecond=self.nanosecond, tzinfo=tmp.tzinfo, fold=tmp.fold)

    def replace(self, year=None, month=None, day=None, hour=None, minute=None, second=None,
                microsecond=None, tzinfo=None, *, fold=None, nanosecond=None):
        """Return new timestamp with specified fields replaced.
        """
        return self.__class__(
            self.year if year is None else year,
            self.month if month is None else month,
            self.day if day is None else day,
            self.hour if hour is None else hour,
            self.minute if minute is None else minute,
            self.second if second is None else second,
            nanosecond=((self.nanosecond if microsecond is None else microsecond * 1000)
                        if nanosecond is None else nanosecond),
            tzinfo=self.tzinfo if tzinfo is None else tzinfo,
            fold=self.fold if fold is None else fold)

    def __hash__(self):
        return hash((super().__hash__(), self.nanosecond)) if self.nanosecond else super().__hash__()

    def __eq__(self, other):
        return super().__eq__(other) and self.nanosecond == (
            other.nanosecond if isinstance(other, nsdatetime) else other.microsecond * 1000)

    def __gt__(self, other):
        return super().__gt__(other) or (super().__eq__(other) and self.nanosecond > (
            other.nanosecond if isinstance(other, nsdatetime) else other.microsecond * 1000))

    def __lt__(self, other):
        return not (self > other or self == other)

    def __ge__(self, other):
        return not self < other

    def __le__(self, other):
        return not self > other

    def __ne__(self, other):
        return not self == other
(year, month=None, day=None, hour=0, minute=0, second=0, microsecond=0, tzinfo=None, *, fold=0, nanosecond: int = 0)
11,697
rarfile
__eq__
null
def __eq__(self, other):
    return super().__eq__(other) and self.nanosecond == (
        other.nanosecond if isinstance(other, nsdatetime) else other.microsecond * 1000)
(self, other)
11,698
rarfile
__ge__
null
def __ge__(self, other):
    return not self < other
(self, other)
11,699
rarfile
__gt__
null
def __gt__(self, other):
    return super().__gt__(other) or (super().__eq__(other) and self.nanosecond > (
        other.nanosecond if isinstance(other, nsdatetime) else other.microsecond * 1000))
(self, other)
11,700
rarfile
__hash__
null
def __hash__(self):
    return hash((super().__hash__(), self.nanosecond)) if self.nanosecond else super().__hash__()
(self)
11,701
rarfile
__le__
null
def __le__(self, other):
    return not self > other
(self, other)
11,702
rarfile
__lt__
null
def __lt__(self, other):
    return not (self > other or self == other)
(self, other)
11,703
rarfile
__ne__
null
def __ne__(self, other):
    return not self == other
(self, other)
11,704
rarfile
__new__
null
def __new__(cls, year, month=None, day=None, hour=0, minute=0, second=0,
            microsecond=0, tzinfo=None, *, fold=0, nanosecond=0):
    usec, mod = divmod(nanosecond, 1000) if nanosecond else (microsecond, 0)
    if mod == 0:
        return datetime(year, month, day, hour, minute, second, usec, tzinfo, fold=fold)
    self = super().__new__(cls, year, month, day, hour, minute, second, usec,
                           tzinfo, fold=fold)
    self.nanosecond = nanosecond
    return self
(cls, year, month=None, day=None, hour=0, minute=0, second=0, microsecond=0, tzinfo=None, *, fold=0, nanosecond=0)
11,705
rarfile
astimezone
Convert to new timezone.
def astimezone(self, tz=None): """Convert to new timezone. """ tmp = super().astimezone(tz) return self.__class__(tmp.year, tmp.month, tmp.day, tmp.hour, tmp.minute, tmp.second, nanosecond=self.nanosecond, tzinfo=tmp.tzinfo, fold=tmp.fold)
(self, tz=None)
11,706
rarfile
isoformat
Formats with nanosecond precision by default.
def isoformat(self, sep="T", timespec="auto"): """Formats with nanosecond precision by default. """ if timespec == "auto": pre, post = super().isoformat(sep, "microseconds").split(".", 1) return f"{pre}.{self.nanosecond:09d}{post[6:]}" return super().isoformat(sep, timespec)
(self, sep='T', timespec='auto')
11,707
rarfile
replace
Return new timestamp with specified fields replaced.
def replace(self, year=None, month=None, day=None, hour=None, minute=None, second=None, microsecond=None, tzinfo=None, *, fold=None, nanosecond=None): """Return new timestamp with specified fields replaced. """ return self.__class__( self.year if year is None else year, self.month if month is None else month, self.day if day is None else day, self.hour if hour is None else hour, self.minute if minute is None else minute, self.second if second is None else second, nanosecond=((self.nanosecond if microsecond is None else microsecond * 1000) if nanosecond is None else nanosecond), tzinfo=self.tzinfo if tzinfo is None else tzinfo, fold=self.fold if fold is None else fold)
(self, year=None, month=None, day=None, hour=None, minute=None, second=None, microsecond=None, tzinfo=None, *, fold=None, nanosecond=None)
11,709
rarfile
parse_dos_time
Parse standard 32-bit DOS timestamp.
def parse_dos_time(stamp): """Parse standard 32-bit DOS timestamp. """ sec, stamp = stamp & 0x1F, stamp >> 5 mn, stamp = stamp & 0x3F, stamp >> 6 hr, stamp = stamp & 0x1F, stamp >> 5 day, stamp = stamp & 0x1F, stamp >> 5 mon, stamp = stamp & 0x0F, stamp >> 4 yr = (stamp & 0x7F) + 1980 return (yr, mon, day, hr, mn, sec * 2)
(stamp)
11,710
rarfile
rar3_decompress
Decompress blob of compressed data. Used for data with non-standard header - eg. comments.
def rar3_decompress(vers, meth, data, declen=0, flags=0, crc=0, pwd=None, salt=None): """Decompress blob of compressed data. Used for data with non-standard header - eg. comments. """ # already uncompressed? if meth == RAR_M0 and (flags & RAR_FILE_PASSWORD) == 0: return data # take only necessary flags flags = flags & (RAR_FILE_PASSWORD | RAR_FILE_SALT | RAR_FILE_DICTMASK) flags |= RAR_LONG_BLOCK # file header fname = b"data" date = ((2010 - 1980) << 25) + (12 << 21) + (31 << 16) mode = DOS_MODE_ARCHIVE fhdr = S_FILE_HDR.pack(len(data), declen, RAR_OS_MSDOS, crc, date, vers, meth, len(fname), mode) fhdr += fname if salt: fhdr += salt # full header hlen = S_BLK_HDR.size + len(fhdr) hdr = S_BLK_HDR.pack(0, RAR_BLOCK_FILE, flags, hlen) + fhdr hcrc = crc32(hdr[2:]) & 0xFFFF hdr = S_BLK_HDR.pack(hcrc, RAR_BLOCK_FILE, flags, hlen) + fhdr # archive main header mh = S_BLK_HDR.pack(0x90CF, RAR_BLOCK_MAIN, 0, 13) + b"\0" * (2 + 4) # decompress via temp rar setup = tool_setup() tmpfd, tmpname = mkstemp(suffix=".rar", dir=HACK_TMP_DIR) tmpf = os.fdopen(tmpfd, "wb") try: tmpf.write(RAR_ID + mh + hdr + data) tmpf.close() curpwd = (flags & RAR_FILE_PASSWORD) and pwd or None cmd = setup.open_cmdline(curpwd, tmpname) p = custom_popen(cmd) return p.communicate()[0] finally: tmpf.close() os.unlink(tmpname)
(vers, meth, data, declen=0, flags=0, crc=0, pwd=None, salt=None)
11,711
rarfile
rar3_s2k
String-to-key hash for RAR3.
def rar3_s2k(pwd, salt): """String-to-key hash for RAR3. """ if not isinstance(pwd, str): pwd = pwd.decode("utf8") seed = bytearray(pwd.encode("utf-16le") + salt) h = Rar3Sha1(rarbug=True) iv = b"" for i in range(16): for j in range(0x4000): cnt = S_LONG.pack(i * 0x4000 + j) h.update(seed) h.update(cnt[:3]) if j == 0: iv += h.digest()[19:20] key_be = h.digest()[:16] key_le = pack("<LLLL", *unpack(">LLLL", key_be)) return key_le, iv
(pwd, salt)
11,713
rarfile
sanitize_filename
Make filename safe for write access.
def sanitize_filename(fname, pathsep, is_win32): """Make filename safe for write access. """ if is_win32: if len(fname) > 1 and fname[1] == ":": fname = fname[2:] rc = RC_BAD_CHARS_WIN32 else: rc = RC_BAD_CHARS_UNIX if rc.search(fname): fname = rc.sub("_", fname) parts = [] for seg in fname.split("/"): if seg in ("", ".", ".."): continue if is_win32 and seg[-1] in (" ", "."): seg = seg[:-1] + "_" parts.append(seg) return pathsep.join(parts)
(fname, pathsep, is_win32)
11,717
datetime
timezone
Fixed offset from UTC implementation of tzinfo.
class timezone(tzinfo): __slots__ = '_offset', '_name' # Sentinel value to disallow None _Omitted = object() def __new__(cls, offset, name=_Omitted): if not isinstance(offset, timedelta): raise TypeError("offset must be a timedelta") if name is cls._Omitted: if not offset: return cls.utc name = None elif not isinstance(name, str): raise TypeError("name must be a string") if not cls._minoffset <= offset <= cls._maxoffset: raise ValueError("offset must be a timedelta " "strictly between -timedelta(hours=24) and " "timedelta(hours=24).") return cls._create(offset, name) @classmethod def _create(cls, offset, name=None): self = tzinfo.__new__(cls) self._offset = offset self._name = name return self def __getinitargs__(self): """pickle support""" if self._name is None: return (self._offset,) return (self._offset, self._name) def __eq__(self, other): if isinstance(other, timezone): return self._offset == other._offset return NotImplemented def __hash__(self): return hash(self._offset) def __repr__(self): """Convert to formal string, for repr(). 
>>> tz = timezone.utc >>> repr(tz) 'datetime.timezone.utc' >>> tz = timezone(timedelta(hours=-5), 'EST') >>> repr(tz) "datetime.timezone(datetime.timedelta(-1, 68400), 'EST')" """ if self is self.utc: return 'datetime.timezone.utc' if self._name is None: return "%s.%s(%r)" % (self.__class__.__module__, self.__class__.__qualname__, self._offset) return "%s.%s(%r, %r)" % (self.__class__.__module__, self.__class__.__qualname__, self._offset, self._name) def __str__(self): return self.tzname(None) def utcoffset(self, dt): if isinstance(dt, datetime) or dt is None: return self._offset raise TypeError("utcoffset() argument must be a datetime instance" " or None") def tzname(self, dt): if isinstance(dt, datetime) or dt is None: if self._name is None: return self._name_from_offset(self._offset) return self._name raise TypeError("tzname() argument must be a datetime instance" " or None") def dst(self, dt): if isinstance(dt, datetime) or dt is None: return None raise TypeError("dst() argument must be a datetime instance" " or None") def fromutc(self, dt): if isinstance(dt, datetime): if dt.tzinfo is not self: raise ValueError("fromutc: dt.tzinfo " "is not self") return dt + self._offset raise TypeError("fromutc() argument must be a datetime instance" " or None") _maxoffset = timedelta(hours=24, microseconds=-1) _minoffset = -_maxoffset @staticmethod def _name_from_offset(delta): if not delta: return 'UTC' if delta < timedelta(0): sign = '-' delta = -delta else: sign = '+' hours, rest = divmod(delta, timedelta(hours=1)) minutes, rest = divmod(rest, timedelta(minutes=1)) seconds = rest.seconds microseconds = rest.microseconds if microseconds: return (f'UTC{sign}{hours:02d}:{minutes:02d}:{seconds:02d}' f'.{microseconds:06d}') if seconds: return f'UTC{sign}{hours:02d}:{minutes:02d}:{seconds:02d}' return f'UTC{sign}{hours:02d}:{minutes:02d}'
null
11,718
rarfile
to_datetime
Convert 6-part time tuple into datetime object.
def to_datetime(t): """Convert 6-part time tuple into datetime object. """ # extract values year, mon, day, h, m, s = t # assume the values are valid try: return datetime(year, mon, day, h, m, s) except ValueError: pass # sanitize invalid values mday = (0, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31) mon = max(1, min(mon, 12)) day = max(1, min(day, mday[mon])) h = min(h, 23) m = min(m, 59) s = min(s, 59) return datetime(year, mon, day, h, m, s)
(t)
11,719
rarfile
to_nsdatetime
Apply nanoseconds to datetime.
def to_nsdatetime(dt, nsec): """Apply nanoseconds to datetime. """ if not nsec: return dt return nsdatetime(dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second, tzinfo=dt.tzinfo, fold=dt.fold, nanosecond=nsec)
(dt, nsec)
11,720
rarfile
to_nsecs
Convert datatime instance to nanoseconds.
def to_nsecs(dt): """Convert datatime instance to nanoseconds. """ secs = int(dt.timestamp()) nsecs = dt.nanosecond if isinstance(dt, nsdatetime) else dt.microsecond * 1000 return secs * 1000000000 + nsecs
(dt)
11,721
rarfile
tool_setup
Pick a tool, return cached ToolSetup.
def tool_setup(unrar=True, unar=True, bsdtar=True, sevenzip=True, sevenzip2=True, force=False): """Pick a tool, return cached ToolSetup. """ global CURRENT_SETUP if force: CURRENT_SETUP = None if CURRENT_SETUP is not None: return CURRENT_SETUP lst = [] if unrar: lst.append(UNRAR_CONFIG) if unar: lst.append(UNAR_CONFIG) if sevenzip: lst.append(SEVENZIP_CONFIG) if sevenzip2: lst.append(SEVENZIP2_CONFIG) if bsdtar: lst.append(BSDTAR_CONFIG) for conf in lst: setup = ToolSetup(conf) if setup.check(): CURRENT_SETUP = setup break if CURRENT_SETUP is None: raise RarCannotExec("Cannot find working tool") return CURRENT_SETUP
(unrar=True, unar=True, bsdtar=True, sevenzip=True, sevenzip2=True, force=False)
11,727
parse
SingleEntry
SingleEntry(package: str, name: str, docstring: Optional[str], code: Optional[str], signature: Optional[str])
class SingleEntry: # name of the pip package package: str # name of the class or method name: str # docstring of the class or method docstring: Optional[str] # code of the class or method code: Optional[str] # signature of the class or method signature: Optional[str]
(package: str, name: str, docstring: Optional[str], code: Optional[str], signature: Optional[str]) -> None
11,728
parse
__eq__
null
import subprocess import argparse from dataclasses import dataclass from typing import Optional, Any import sys import os import inspect_sphinx import inspect from dill.source import getsource import pathlib import pandas as pd import importlib @dataclass class SingleEntry: # name of the pip package package: str # name of the class or method name: str # docstring of the class or method docstring: Optional[str] # code of the class or method code: Optional[str] # signature of the class or method signature: Optional[str]
(self, other)
11,731
parse
_recursive_package_list_modules
null
def _recursive_package_list_modules(package: object): # get all classes from a package for name, obj in inspect.getmembers(package): if inspect.isfunction(obj): yield obj if inspect.isclass(obj): yield obj for name, obj in inspect.getmembers(obj): if inspect.isfunction(obj): yield obj elif inspect.ismodule(obj): try: yield from _recursive_package_list_modules(obj.__name__) except: continue
(package: object)
11,734
parse
get_all_docstrings_and_code
null
def get_all_docstrings_and_code(package_name: str) -> list[SingleEntry]: # get all classes from a package package = importlib.import_module(package_name) all_classes = _recursive_package_list_modules(package) # get docstrings and code of all classes return [get_docs_and_code(cls) for cls in all_classes]
(package_name: str) -> list[parse.SingleEntry]
11,735
parse
get_docs_and_code
null
def get_docs_and_code(cls_or_method: Any) -> SingleEntry: try: doc = inspect_sphinx.getdoc(cls_or_method) except: doc = None try: code = getsource( cls_or_method, builtin=True, force=True, lstrip=True, enclosing=True ) except: code = None try: signature = str(inspect_sphinx.signature(cls_or_method)) except: signature = None return SingleEntry( package=cls_or_method.__module__, docstring=doc, code=code, signature=signature, name=cls_or_method.__name__, )
(cls_or_method: Any) -> parse.SingleEntry
11,736
dill.source
getsource
Return the text of the source code for an object. The source code for interactively-defined objects are extracted from the interpreter's history. The argument may be a module, class, method, function, traceback, frame, or code object. The source code is returned as a single string. An IOError is raised if the source code cannot be retrieved, while a TypeError is raised for objects where the source code is unavailable (e.g. builtins). If alias is provided, then add a line of code that renames the object. If lstrip=True, ensure there is no indentation in the first line of code. If enclosing=True, then also return any enclosing code. If force=True, catch (TypeError,IOError) and try to use import hooks. If builtin=True, force an import for any builtins
def getsource(object, alias='', lstrip=False, enclosing=False, \ force=False, builtin=False): """Return the text of the source code for an object. The source code for interactively-defined objects are extracted from the interpreter's history. The argument may be a module, class, method, function, traceback, frame, or code object. The source code is returned as a single string. An IOError is raised if the source code cannot be retrieved, while a TypeError is raised for objects where the source code is unavailable (e.g. builtins). If alias is provided, then add a line of code that renames the object. If lstrip=True, ensure there is no indentation in the first line of code. If enclosing=True, then also return any enclosing code. If force=True, catch (TypeError,IOError) and try to use import hooks. If builtin=True, force an import for any builtins """ # hascode denotes a callable hascode = _hascode(object) # is a class instance type (and not in builtins) instance = _isinstance(object) # get source lines; if fail, try to 'force' an import try: # fails for builtins, and other assorted object types lines, lnum = getsourcelines(object, enclosing=enclosing) except (TypeError, IOError): # failed to get source, resort to import hooks if not force: # don't try to get types that findsource can't get raise if not getmodule(object): # get things like 'None' and '1' if not instance: return getimport(object, alias, builtin=builtin) # special handling (numpy arrays, ...) _import = getimport(object, builtin=builtin) name = getname(object, force=True) _alias = "%s = " % alias if alias else "" if alias == name: _alias = "" return _import+_alias+"%s\n" % name else: #FIXME: could use a good bit of cleanup, since using getimport... if not instance: return getimport(object, alias, builtin=builtin) # now we are dealing with an instance... 
name = object.__class__.__name__ module = object.__module__ if module in ['builtins','__builtin__']: return getimport(object, alias, builtin=builtin) else: #FIXME: leverage getimport? use 'from module import name'? lines, lnum = ["%s = __import__('%s', fromlist=['%s']).%s\n" % (name,module,name,name)], 0 obj = eval(lines[0].lstrip(name + ' = ')) lines, lnum = getsourcelines(obj, enclosing=enclosing) # strip leading indent (helps ensure can be imported) if lstrip or alias: lines = _outdent(lines) # instantiate, if there's a nice repr #XXX: BAD IDEA??? if instance: #and force: #XXX: move into findsource or getsourcelines ? if '(' in repr(object): lines.append('%r\n' % object) #else: #XXX: better to somehow to leverage __reduce__ ? # reconstructor,args = object.__reduce__() # _ = reconstructor(*args) else: # fall back to serialization #XXX: bad idea? #XXX: better not duplicate work? #XXX: better new/enclose=True? lines = dumpsource(object, alias='', new=force, enclose=False) lines, lnum = [line+'\n' for line in lines.split('\n')][:-1], 0 #else: object.__code__ # raise AttributeError # add an alias to the source code if alias: if hascode: skip = 0 for line in lines: # skip lines that are decorators if not line.startswith('@'): break skip += 1 #XXX: use regex from findsource / getsourcelines ? 
if lines[skip].lstrip().startswith('def '): # we have a function if alias != object.__name__: lines.append('\n%s = %s\n' % (alias, object.__name__)) elif 'lambda ' in lines[skip]: # we have a lambda if alias != lines[skip].split('=')[0].strip(): lines[skip] = '%s = %s' % (alias, lines[skip]) else: # ...try to use the object's name if alias != object.__name__: lines.append('\n%s = %s\n' % (alias, object.__name__)) else: # class or class instance if instance: if alias != lines[-1].split('=')[0].strip(): lines[-1] = ('%s = ' % alias) + lines[-1] else: name = getname(object, force=True) or object.__name__ if alias != name: lines.append('\n%s = %s\n' % (alias, name)) return ''.join(lines)
(object, alias='', lstrip=False, enclosing=False, force=False, builtin=False)
11,741
parse
package_to_s3parquet
gets all docstrings and code of a package and saves it to a parquet file pushes the parquet file to s3
def package_to_s3parquet(package_name: str, s3_bucket: Optional[str]): """ gets all docstrings and code of a package and saves it to a parquet file pushes the parquet file to s3 """ os.environ["VIRTUAL_ENV"] = sys.prefix subprocess.check_call(["uv", "pip", "install", package_name]) # get all docstrings and code of the package print(f"installing package {package_name} done.") package_name = package_name.replace("-", "_") entries = get_all_docstrings_and_code(package_name) # convert to a pandas dataframe df = pd.DataFrame([entry.__dict__ for entry in entries]) path = s3_bucket + f"/{package_name}.parquet" # save to parquet df.to_parquet(path) print(f"Exported parquet to {path}")
(package_name: str, s3_bucket: Optional[str])
11,746
parse
test_get_all_docstrings_and_code
null
def test_get_all_docstrings_and_code(library_name: str = "transformers"): # get all docstrings and code of the transformers package entries = get_all_docstrings_and_code("transformers") # print some stats print(f"Number of classes: {len(entries)}") print( f"Number of classes with docstrings: {len([entry for entry in entries if entry.docstring])}" ) print( f"Number of classes with code: {len([entry for entry in entries if entry.code])}" ) print( f"Number of classes with signature: {len([entry for entry in entries if entry.signature])}" ) # check that all classes have a package name starting with 'transformers' assert all([entry.package.startswith("transformers") for entry in entries])
(library_name: str = 'transformers')
11,747
parse
test_get_docs_and_code
null
def test_get_docs_and_code(): # get docstring and code of the class from transformers import LlamaPreTrainedModel entry = get_docs_and_code(LlamaPreTrainedModel) assert entry.package.startswith("transformers") assert entry.docstring == LlamaPreTrainedModel.__doc__
()
11,749
allan_variance.avar
compute_avar
Compute Allan variance (AVAR). Consider an underlying measurement y(t). Our sensors output integrals of y(t) over successive time intervals of length dt. These measurements x(k * dt) form the input to this function. Allan variance is defined for different averaging times tau = m * dt as follows:: AVAR(tau) = 1/2 * <(Y(k + m) - Y(k))>, where Y(j) is the time average value of y(t) over [k * dt, (k + m) * dt] (call it a cluster), and < ... > means averaging over different clusters. If we define X(j) being an integral of x(s) from 0 to dt * j, we can rewrite the AVAR as follows:: AVAR(tau) = 1/(2 * tau**2) * <X(k + 2 * m) - 2 * X(k + m) + X(k)> We implement < ... > by averaging over different clusters of a given sample with overlapping, and X(j) is readily available from x. Parameters ---------- x : ndarray, shape (n, ...) Sensor readings, interpretation depends on `input_type` argument. Assumed to vary along the 0-th axis. dt : float, optional Sampling period. Default is 1. tau_min : float or None, optional Minimum averaging time to use. If None (default), it is assigned equal to `dt`. tau_max : float or None, optional Maximum averaging time to use. If None (default), it is chosen automatically such that to averaging is done over 10 *independent* clusters. n_clusters : int, optional Number of clusters to compute Allan variance for. The averaging times will be spread approximately uniform in a log scale. Default is 100. input_type : {'mean', 'increment', 'integral'}, optional How to interpret the input data: - 'mean': `x` is assumed to contain mean values of y over successive time intervals. - 'increment': `x` is assumed to contain integral increments over successive time intervals. - 'integral': `x` is assumed to contain cumulative integral of y from the time start. Note that 'mean' can be used when passing filtered and sampled value of y. 
But in this case generally the filtering process affects properties of the underlying signal y and "effective" Allan variance is computed, i.e. you can use obtained parameters when working with the sampled signal. Returns ------- tau : ndarray Averaging times for which Allan variance was computed, 1-d array. avar : ndarray Values of AVAR. The 0-th dimension is the same as for `tau`. The trailing dimensions match ones for `x`. References ---------- .. [1] https://en.wikipedia.org/wiki/Allan_variance
def compute_avar(x, dt=1, tau_min=None, tau_max=None, n_clusters=100, input_type='mean'): """Compute Allan variance (AVAR). Consider an underlying measurement y(t). Our sensors output integrals of y(t) over successive time intervals of length dt. These measurements x(k * dt) form the input to this function. Allan variance is defined for different averaging times tau = m * dt as follows:: AVAR(tau) = 1/2 * <(Y(k + m) - Y(k))>, where Y(j) is the time average value of y(t) over [k * dt, (k + m) * dt] (call it a cluster), and < ... > means averaging over different clusters. If we define X(j) being an integral of x(s) from 0 to dt * j, we can rewrite the AVAR as follows:: AVAR(tau) = 1/(2 * tau**2) * <X(k + 2 * m) - 2 * X(k + m) + X(k)> We implement < ... > by averaging over different clusters of a given sample with overlapping, and X(j) is readily available from x. Parameters ---------- x : ndarray, shape (n, ...) Sensor readings, interpretation depends on `input_type` argument. Assumed to vary along the 0-th axis. dt : float, optional Sampling period. Default is 1. tau_min : float or None, optional Minimum averaging time to use. If None (default), it is assigned equal to `dt`. tau_max : float or None, optional Maximum averaging time to use. If None (default), it is chosen automatically such that to averaging is done over 10 *independent* clusters. n_clusters : int, optional Number of clusters to compute Allan variance for. The averaging times will be spread approximately uniform in a log scale. Default is 100. input_type : {'mean', 'increment', 'integral'}, optional How to interpret the input data: - 'mean': `x` is assumed to contain mean values of y over successive time intervals. - 'increment': `x` is assumed to contain integral increments over successive time intervals. - 'integral': `x` is assumed to contain cumulative integral of y from the time start. Note that 'mean' can be used when passing filtered and sampled value of y. 
But in this case generally the filtering process affects properties of the underlying signal y and "effective" Allan variance is computed, i.e. you can use obtained parameters when working with the sampled signal. Returns ------- tau : ndarray Averaging times for which Allan variance was computed, 1-d array. avar : ndarray Values of AVAR. The 0-th dimension is the same as for `tau`. The trailing dimensions match ones for `x`. References ---------- .. [1] https://en.wikipedia.org/wiki/Allan_variance """ ALLOWED_INPUT_TYPES = ['mean', 'increment', 'integral'] if input_type not in ALLOWED_INPUT_TYPES: raise ValueError("`input_type` must be one of {}." .format(ALLOWED_INPUT_TYPES)) x = np.asarray(x, dtype=float) if input_type == 'integral': X = x else: X = np.cumsum(x, axis=0) cluster_sizes = _compute_cluster_sizes(len(x), dt, tau_min, tau_max, n_clusters) avar = np.empty(cluster_sizes.shape + X.shape[1:]) for i, k in enumerate(cluster_sizes): c = X[2*k:] - 2 * X[k:-k] + X[:-2*k] avar[i] = np.mean(c**2, axis=0) / k / k if input_type == 'mean': avar *= 0.5 else: avar *= 0.5 / dt**2 return cluster_sizes * dt, avar
(x, dt=1, tau_min=None, tau_max=None, n_clusters=100, input_type='mean')
11,750
allan_variance.avar
estimate_parameters
Estimate noise parameters from Allan variance. The parameters being estimated are typical for inertial sensors: quantization noise, additive white noise, flicker noise (long term bias instability), random walk and linear ramp (this is a deterministic effect). The parameters are estimated using linear least squares with weights inversely proportional to the values of Allan variance. That is the sum of relative error is minimized. This approach is approximately equivalent of doing estimation in the log-log scale. If the variable of the underlying process has a unit U (say deg/s for gyroscopes) then the estimated parameters has the following units: - U * s for quantization noise (deg for gyros) - U * s**0.5 for white noise (deg/s**0.5 for gyros) - U for flicker noise (deg/s for gyros) - U / s**0.5 for random walk (deg/s/s**0.5 for gyros) - U / s for ramp (deg/s/s for gyros) Parameters ---------- tau : ndarray, shape (n,) Values of averaging time. avar : ndarray, shape (n,) or (n, m) Values of Allan variance corresponding to `tau`. effects : list or None, optional Which effects to estimate. Allowed effects are 'quantization', 'white', 'flicker', 'walk', 'ramp'. If None (default), estimate all of the mentioned above effects. sensor_names : list or None, optional How to name sensors in the output. If None (default), use integer values as names. Returns ------- params : pandas DataFrame or Series Estimated parameters. prediction : ndarray, shape (n,) or (n, m) Predicted values of Allan variance using the estimated parameters.
def estimate_parameters(tau, avar, effects=None, sensor_names=None): """Estimate noise parameters from Allan variance. The parameters being estimated are typical for inertial sensors: quantization noise, additive white noise, flicker noise (long term bias instability), random walk and linear ramp (this is a deterministic effect). The parameters are estimated using linear least squares with weights inversely proportional to the values of Allan variance. That is the sum of relative error is minimized. This approach is approximately equivalent of doing estimation in the log-log scale. If the variable of the underlying process has a unit U (say deg/s for gyroscopes) then the estimated parameters has the following units: - U * s for quantization noise (deg for gyros) - U * s**0.5 for white noise (deg/s**0.5 for gyros) - U for flicker noise (deg/s for gyros) - U / s**0.5 for random walk (deg/s/s**0.5 for gyros) - U / s for ramp (deg/s/s for gyros) Parameters ---------- tau : ndarray, shape (n,) Values of averaging time. avar : ndarray, shape (n,) or (n, m) Values of Allan variance corresponding to `tau`. effects : list or None, optional Which effects to estimate. Allowed effects are 'quantization', 'white', 'flicker', 'walk', 'ramp'. If None (default), estimate all of the mentioned above effects. sensor_names : list or None, optional How to name sensors in the output. If None (default), use integer values as names. Returns ------- params : pandas DataFrame or Series Estimated parameters. prediction : ndarray, shape (n,) or (n, m) Predicted values of Allan variance using the estimated parameters. 
""" ALLOWED_EFFECTS = ['quantization', 'white', 'flicker', 'walk', 'ramp'] avar = np.asarray(avar) single_series = avar.ndim == 1 if single_series: avar = avar[:, None] if effects is None: effects = ALLOWED_EFFECTS elif not set(effects) <= set(ALLOWED_EFFECTS): raise ValueError("Unknown effects are passed.") n = len(tau) A = np.empty((n, 5)) A[:, 0] = 3 / tau**2 A[:, 1] = 1 / tau A[:, 2] = 2 * np.log(2) / np.pi A[:, 3] = tau / 3 A[:, 4] = tau**2 / 2 mask = ['quantization' in effects, 'white' in effects, 'flicker' in effects, 'walk' in effects, 'ramp' in effects] A = A[:, mask] effects = np.asarray(ALLOWED_EFFECTS)[mask] params = [] prediction = [] for column in range(avar.shape[1]): avar_single = avar[:, column] A_scaled = A / avar_single[:, None] x = nnls(A_scaled, np.ones(n))[0] prediction.append(A_scaled.dot(x) * avar_single) params.append(np.sqrt(x)) params = np.asarray(params) prediction = np.asarray(prediction).T params = pd.DataFrame(params, index=sensor_names, columns=effects) if single_series: params = params.iloc[0] prediction = prediction[:, 0] return params, prediction
(tau, avar, effects=None, sensor_names=None)
11,751
allan_variance.noise
generate_noise
Generate Gaussian noise with a power law spectrum. The function generates a noise sequence with the following power spectral density (PSD) function:: PSD(f) = magnitude**2 / (2 * np.pi * f)**alpha Where ``magnitude`` and ``alpha`` are function arguments. The noise is generated using a "fractional differencing" approach [1]_ which does digital convolution of a white noise sequence with an impulse response corresponding to the following transfer function:: H(z) = 1 / (1 - z**-1)**(alpha/2) With ``alpha = 2`` this transfer function is rational and corresponds to a well known process with independent increments or Browninan motion. For other alphas (for example ``alpha = 1`` -- flicker noise) it is non-rational and has to be expanded into a Taylor series to implement a convolution. The size of expansion is equal to the size of the desired output. Parameters ---------- alpha : float Exponent in the denominator of the power law. shape : array_like Required shape of the output. The time is assumed to vary along the 0-th dimension. For example, to generate 100 signals of length 1000 pass `shape` as (1000, 100). dt : float, optional Time period between samples in seconds. Default is 1.0. scale : float, optional Input unit white noise is multiplied by `scale`. Default is 1.0. rng : None, int or `numpy.random.RandomState`, optional Seed to create or already created RandomState. None (default) corresponds to nondeterministic seeding. Returns ------- ndarray Generated noise with the given `shape`. References ---------- .. [1] J. N. Kasdin "Discrete Simulation of Colored Noise and Stochastic Processes and 1/f^alpha Power Law Noise Generation"
def generate_noise(alpha, shape, dt=1.0, scale=1.0, rng=None): """Generate Gaussian noise with a power law spectrum. The function generates a noise sequence with the following power spectral density (PSD) function:: PSD(f) = magnitude**2 / (2 * np.pi * f)**alpha Where ``magnitude`` and ``alpha`` are function arguments. The noise is generated using a "fractional differencing" approach [1]_ which does digital convolution of a white noise sequence with an impulse response corresponding to the following transfer function:: H(z) = 1 / (1 - z**-1)**(alpha/2) With ``alpha = 2`` this transfer function is rational and corresponds to a well known process with independent increments or Browninan motion. For other alphas (for example ``alpha = 1`` -- flicker noise) it is non-rational and has to be expanded into a Taylor series to implement a convolution. The size of expansion is equal to the size of the desired output. Parameters ---------- alpha : float Exponent in the denominator of the power law. shape : array_like Required shape of the output. The time is assumed to vary along the 0-th dimension. For example, to generate 100 signals of length 1000 pass `shape` as (1000, 100). dt : float, optional Time period between samples in seconds. Default is 1.0. scale : float, optional Input unit white noise is multiplied by `scale`. Default is 1.0. rng : None, int or `numpy.random.RandomState`, optional Seed to create or already created RandomState. None (default) corresponds to nondeterministic seeding. Returns ------- ndarray Generated noise with the given `shape`. References ---------- .. [1] J. N. 
Kasdin "Discrete Simulation of Colored Noise and Stochastic Processes and 1/f^alpha Power Law Noise Generation" """ rng = check_random_state(rng) shape = np.atleast_1d(shape) n_samples = shape[0] h = np.empty(n_samples) h[0] = 1 for i in range(1, n_samples): h[i] = h[i - 1] * (0.5 * alpha + i - 1) / i h = h.reshape((-1,) + (1,) * (len(shape) - 1)) w = rng.randn(*shape) return scale * dt ** (-0.5 * (1 - alpha)) * fftconvolve(h, w)[:n_samples]
(alpha, shape, dt=1.0, scale=1.0, rng=None)
11,753
hstspreload
_crc8
null
def _crc8(value: bytes) -> int: # CRC8 reference implementation: https://github.com/niccokunzmann/crc8 checksum = 0x00 for byte in value: checksum = _CRC8_TABLE[checksum ^ byte] return checksum
(value: bytes) -> int