PEP 706 – Filter for tarfile.extractall
- Author: Petr Viktorin <encukou at gmail.com>
- Discussions-To: Discourse thread
- Type: Standards Track
- Post-History: 25-Jan-2023, 15-Feb-2023
- Resolution: Discourse message
Abstract
The extraction methods in tarfile gain a filter argument,
which allows rejecting files or modifying metadata as the archive is extracted.
Three built-in named filters are provided, aimed at limiting features that
might be surprising or dangerous.
These can be used as-is, or serve as a base for custom filters.
After a deprecation period, a strict (but safer) filter will become the default.
Motivation
The tar format is used for several use cases, many of which have different
needs. For example:
- A backup of a UNIX workstation should faithfully preserve all kinds of details like file permissions, symlinks to system configuration, and various kinds of special files.
- When unpacking a data bundle, it’s much more important that the unpacking will not have unintended consequences – like exposing a password file by symlinking it to a public place.
To support all its use cases, the tar format has many features.
In many cases, it’s best to ignore or disallow some of them when extracting
an archive.
Python allows extracting tar archives using
tarfile.TarFile.extractall(), whose docs warn to
never extract archives from untrusted sources without prior inspection.
However, it’s not clear what kind of inspection should be done.
Indeed, it’s quite tricky to do such an inspection correctly.
As a result, many people don’t bother, or do the check incorrectly, resulting in
security issues such as CVE-2007-4559.
Since tarfile was first written, it’s become more
accepted that warnings in documentation are not enough.
Whenever possible, an unsafe operation should be explicitly requested;
potentially dangerous operations should look dangerous.
However, TarFile.extractall looks benign in a code review.
Tarfile extraction is also exposed via shutil.unpack_archive(),
which allows the user to not care about the kind of archive they’re
dealing with.
The API is very inviting for extracting archives without prior inspection,
even though the docs again warn against it.
It has been argued that Python is not wrong – it behaves exactly as documented – but that’s beside the point. Let’s improve the situation rather than assign or avoid blame. Python and its docs are the best place to improve things.
Rationale
How do we improve things?
Unfortunately, we will need to change the defaults, which implies
breaking backwards compatibility.
TarFile.extractall is what people reach for when they need to extract a tarball.
Its default behaviour needs to change.
What would be the best behaviour? That depends on the use case. So, we’ll add several general “policies” to control extraction. They are based on use cases, and ideally they should have straightforward security implications:
- Current behavior: trusting the archive. Suitable e.g. as a building block for libraries that do the check themselves, or extracting an archive you just made yourself.
- Unpacking a UNIX archive: roughly following GNU tar, e.g. stripping leading / from filenames.
- Unpacking a general data archive: the shutil.unpack_archive() use case, where it’s not important to preserve details specific to tar or Unix-like filesystems.
After a deprecation period, the last option – the most limited but most secure one – will become the default.
Even with better general defaults, users should still verify the archives they extract, and perhaps modify some of the metadata. Superficially, the following looks like a reasonable way to do this today:
- Call TarFile.getmembers to get a list of the members.
- Verify or modify each member’s TarInfo object.
- Pass the result to extractall’s members argument.
However, there are some issues with this approach:
- It’s possible to modify TarInfo objects, but the changes to them affect all subsequent operations on the same TarFile object. This behavior is fine for most uses, but despite that, it would be very surprising if TarFile.extractall did this by default.
- getmembers can be expensive, and it requires a seekable archive.
- When verifying members in advance, it may be necessary to track how each member would have changed the filesystem, e.g. how symlinks are being set up. This is hard. We can’t expect users to do it.
To solve these issues we’ll:
- Provide a supported way to “clone” and modify TarInfo objects. A replace method, similar to namedtuple._replace, should do the trick.
- Provide a “filter” hook in extractall’s loop that can modify or discard members before they are processed.
- Require that this hook is called just before extracting each member, so it can scan the current state of the disk. This will greatly simplify the implementation of policies (both in stdlib and user code), at the cost of not being able to do a precise “dry run”.
The hook API will be very similar to the existing filter argument for
TarFile.add. We’ll also name it filter.
(In some cases “policy” would be a more fitting name,
but the API can be used for more than security policies.)
The built-in policies/filters described above will be implemented using the public filter API, so they can be used as building blocks or examples.
Setting a precedent
If and when other libraries for archive extraction, such as zipfile,
gain similar functionality, they should mimic this API as much as it’s
reasonable.
To enable this for simple cases, the built-in filters will have string names;
e.g. users can pass filter='data' instead of a specific function that deals
with TarInfo objects.
The shutil.unpack_archive() function will get a filter argument, which it
will pass to the underlying extraction function.
Adding a function-based API that would work across archive formats is out of scope of this PEP.
Full disclosure & redistributor info
The PEP author works for Red Hat, a redistributor of Python with different security needs and support periods than CPython in general. Such redistributors may want to carry vendor patches to:
- Allow configuring the defaults system-wide, and
- Change the default as soon as possible, even in older Python versions.
The proposal makes this easy to do, and it allows users to query the settings.
Specification
Modifying and forgetting member metadata
The TarInfo class will gain a new method, replace(), which will work
similarly to dataclasses.replace() (or namedtuple._replace).
It will return a copy of the TarInfo object with attributes
replaced as specified by keyword-only arguments (name, mtime, mode,
linkname, uid, gid, uname, gname).
Any of these, except name and linkname, will be allowed to be set to None.
When extract or extractall encounters such a None, it will not
set that piece of metadata.
(If uname or gname is None, it will fall back to uid or gid,
as if the name wasn’t found.)
When addfile or tobuf encounters such a None, it will raise a ValueError.
When list encounters such a None, it will print a placeholder string.
The documentation will mention why the method is there:
TarInfo objects retrieved from TarFile.getmembers()
are “live”; modifying them directly will affect subsequent unrelated
operations.
The TarFile.extract() and TarFile.extractall() methods will grow a
filter keyword-only parameter,
which takes a callable that can be called as:
filter(member: TarInfo, path: str, /) -> TarInfo | None
where member is the member to be extracted, and path is the path to
where the archive is extracted (i.e., it’ll be the same for every member).
The filter will be called on each member as it is extracted,
and extraction will work with the result.
If it returns None, the member will be skipped.
The function can also raise an exception.
This can, depending on TarFile.errorlevel,
abort the extraction or cause the member to be skipped.
If extraction is aborted, the archive may be left partially extracted. It is the user’s responsibility to clean up.
We will also provide a set of defaults for common use cases.
In addition to a function, the filter argument can be one of the
following strings:
'fully_trusted': Current behavior: honor the metadata as is. Should be used if the user trusts the archive completely, or implements their own complex verification.
'tar': Roughly follow defaults of the GNU tar command (when run as a normal user):
- Strip leading slashes (/ and os.sep) from filenames.
- Refuse to extract files with absolute paths (in case the name is absolute even after the stripping above, e.g. C:/foo on Windows).
- Refuse to extract files whose absolute path (after following symlinks) would end up outside the destination. (Note that GNU tar instead delays creating some links.)
- Clear high mode bits (setuid, setgid, sticky) and group/other write bits (S_IWGRP|S_IWOTH). (This is an approximation of GNU tar’s default, which limits the mode by the current umask setting.)
'data': Extract a “data” archive, disallowing common attack vectors but limiting functionality. In particular, many features specific to UNIX-style filesystems (or equivalently, to the tar archive format) are ignored, making this a good filter for cross-platform archives. In addition to what the 'tar' filter does:
- Refuse to extract links (hard or soft) that link to absolute paths.
- Refuse to extract links (hard or soft) which end up linking to a path outside of the destination.
(On systems that don’t support links, tarfile will, in most cases, fall back to creating regular files. This proposal doesn’t change that behaviour.)
- Refuse to extract device files (including pipes).
- For regular files, including hard links: set the owner read and write permissions (S_IRUSR|S_IWUSR), and remove the group & other executable permission (S_IXGRP|S_IXOTH) if the owner doesn’t have it (S_IXUSR).
- For other files (directories), ignore mode entirely (set it to None).
- Ignore user and group info (set uid, gid, uname and gname to None).
Any other string will cause a ValueError.
The corresponding filter functions will be available as
tarfile.fully_trusted_filter(), tarfile.tar_filter(), etc., so
they can be easily used in custom policies.
Note that these filters never return None.
Skipping members this way is a feature for user-defined filters.
Defaults and their configuration
The TarFile class will gain a new attribute,
extraction_filter, to allow configuring the default filter.
By default it will be None, but users can set it to a callable
that will be used if the filter argument is missing or None.
String names won’t be accepted here. That would encourage code like
my_tarfile.extraction_filter = 'data'.
On Python versions without this feature, this would do nothing,
silently ignoring a security-related request.
If both the argument and attribute are None:
- In Python 3.12-3.13, a DeprecationWarning will be emitted and extraction will use the 'fully_trusted' filter.
- In Python 3.14+, it will use the 'data' filter.
Applications and system integrators may wish to change extraction_filter
of the TarFile class itself to set a global default.
When using a function, they will generally want to wrap it in staticmethod()
to prevent injection of a self argument.
Subclasses of TarFile can also override extraction_filter.
A new exception, FilterError, will be added to the tarfile module.
It’ll have several new subclasses, one for each of the refusal reasons above.
FilterError’s member attribute will contain the relevant TarInfo.
In the lists above, “refusing” to extract a file means that a FilterError
will be raised.
As with other extraction errors, if the errorlevel
is 1 or more, this will abort the extraction; with errorlevel=0 the error
will be logged and the member will be ignored, but extraction will continue.
extractall() may leave the archive partially extracted;
it is the user’s responsibility to clean up.
Errorlevel, and fatal/non-fatal errors
TarFile has an errorlevel argument/attribute, which specifies how errors
are handled:
- With errorlevel=0, the documentation says that “all errors are ignored when using extractall()”. The code only ignores non-fatal and fatal errors (see below), so, for example, you still get a TypeError if you pass None as the destination path.
- With errorlevel=1 (the default), all non-fatal errors are ignored. (They may be logged to sys.stderr by setting the debug argument/attribute.) Which errors are non-fatal is not defined in the documentation, but the code treats ExtractError as such. Specifically, it’s these issues:
- “unable to resolve link inside archive” (raised on systems that do not support symlinks)
- “fifo/special devices not supported by system” (not used for failures if the system supports these, e.g. for a PermissionError)
- “could not change owner/mode/modification time”
Note that, for example, file name too long or out of disk space don’t qualify. The non-fatal errors are not very likely to appear on a Unix-like system.
- With errorlevel=2, all errors are raised, including fatal ones. Which errors are fatal is, again, not defined; in practice it’s OSError.
A filter refusing to extract a member does not fit neatly into the fatal/non-fatal categories.
- This PEP does not change existing behavior. (Ideas for improvements are welcome in Discourse topic 25970.)
- When a filter refuses to extract a member, the error should not pass silently by default.
To satisfy this, FilterError will be considered a fatal error, that is,
it’ll be ignored only with errorlevel=0.
Users that want to ignore FilterError but not other fatal errors should
create a custom filter function, and call a built-in filter in a try block.
Hints for further verification
Even with the proposed changes, tarfile will not be
suited for extracting untrusted files without prior inspection.
Among other issues, the proposed policies don’t prevent denial-of-service
attacks.
Users should do additional checks.
New docs will tell users to consider:
- extracting to a new empty directory,
- using external (e.g. OS-level) limits on disk, memory and CPU usage,
- checking filenames against an allow-list of characters (to filter out control characters, confusables, etc.),
- checking that filenames have expected extensions (discouraging files that execute when you “click on them”, or extension-less files like Windows special device names),
- limiting the number of extracted files, total size of extracted data, and size of individual files,
- checking for files that would be shadowed on case-insensitive filesystems.
Also, the docs will note that:
- tar files commonly contain multiple versions of the same file: later ones are expected to overwrite earlier ones on extraction,
- tarfile does not protect against issues with “live” data, e.g. an attacker tinkering with the destination directory while extracting (or adding) is going on (see the GNU tar manual for more info).
This list is not comprehensive, but the documentation is a good place to
collect such general tips.
It can be moved into a separate document if it grows too long or if it needs
to be consolidated with similar advice elsewhere
(which is out of scope for this proposal).
TarInfo identity, and offset
With filters that use TarInfo.replace(), the TarInfo objects handled
by the extraction machinery will not necessarily be the same objects
as those present in the members list.
This may affect TarInfo subclasses that override methods like
makelink and rely on object identity.
Such code can switch to comparing offset, the position of the member
header inside the file.
Note that both the overridable methods and offset are only
documented in source comments.
Command-line interface
The CLI (python -m tarfile) will gain a --filter option
that will take the name of one of the provided default filters.
It won’t be possible to specify a custom filter function.
If --filter is not given, the CLI will use the default filter
('fully_trusted' with a deprecation warning now, and 'data' from
Python 3.14 on).
There will be no short option. (-f would be confusingly similar to
the filename option of GNU tar.)
Other archive libraries
If and when other archive libraries, such as zipfile,
grow similar functionality, their extraction functions should use a
filter argument that takes, at least, the strings 'fully_trusted' (which should
disable any security precautions) and 'data' (which should avoid features
that might surprise users).
Standardizing a function-based filter API is out of scope of this PEP.
Shutil
shutil.unpack_archive() will gain a filter argument.
If it’s given, it will be passed to the underlying extraction function.
Passing it for a zip archive will fail for now (until zipfile
gains a filter argument, if it ever does).
If filter is not specified (or left as None), it won’t be passed
on, so extracting a tarball will use the default filter
('fully_trusted' with a deprecation warning now, and 'data' from
Python 3.14 on).
Note that some user-defined filters need, for example,
to count extracted members or do post-processing.
This requires a more complex API than a filter callable.
However, that complex API need not be exposed to tarfile.
For example, with a hypothetical StatefulFilter, users would write:
with StatefulFilter() as filter_func:
    my_tar.extractall(path, filter=filter_func)
A StatefulFilter example will be added to the docs.
The need for stateful filters is a reason against allowing
registration of custom filter names in addition to 'fully_trusted',
'tar' and 'data'.
With such a mechanism, an API for (at least) set-up and tear-down would need
to be set in stone.
Backwards Compatibility
The default behavior of TarFile.extract and TarFile.extractall
will change, after raising DeprecationWarning for 2 releases
(the shortest deprecation period allowed in Python’s
backwards compatibility policy).
Additionally, code that relies on TarInfo
object identity may break; see TarInfo identity, and offset.
Backporting & Forward Compatibility
This feature may be backported to older versions of Python.
In CPython, we don’t add warnings to patch releases, so the default
filter should be changed to 'fully_trusted' in backports.
Other than that, all of the changes to tarfile should be backported, so
hasattr(tarfile, 'data_filter') becomes a reliable check for all
of the new functionality.
Note that CPython’s usual policy is to avoid adding new APIs in security
backports.
This feature does not make sense without a new API
(TarFile.extraction_filter and the filter argument),
so we’ll make an exception.
(See Discourse comment 23149/16 for details.)
Here are examples of code that takes into account that
tarfile may or may not have the proposed feature.
When copying these snippets, note that setting extraction_filter
will affect subsequent operations.
- Fully trusted archive:
  my_tarfile.extraction_filter = (lambda member, path: member)
  my_tarfile.extractall()
- Use the 'data' filter if available, but revert to Python 3.11 behavior ('fully_trusted') if this feature is not available:
  my_tarfile.extraction_filter = getattr(tarfile, 'data_filter',
                                         (lambda member, path: member))
  my_tarfile.extractall()
(This is an unsafe operation, so it should be spelled out explicitly, ideally with a comment.)
- Use the 'data' filter; fail if it is not available:
  my_tarfile.extraction_filter = tarfile.data_filter
  my_tarfile.extractall()
- Use the 'data' filter; warn if it is not available:
  if hasattr(tarfile, 'data_filter'):
      my_tarfile.extractall(filter='data')
  else:
      # remove this when no longer needed
      warn_the_user('Extracting may be unsafe; consider updating Python')
      my_tarfile.extractall()
Security Implications
This proposal improves security, at the expense of backwards compatibility.
In particular, it will help users avoid CVE-2007-4559.
How to Teach This
The API, usage notes and tips for further verification will be added to the documentation. These should be usable for users who are familiar with archives in general, but not with the specifics of UNIX filesystems nor the related security issues.
Reference Implementation
See pull request #102953 on GitHub.
Rejected Ideas
SafeTarFile
An initial idea from Lars Gustäbel was to provide a separate class that implements security checks (see gh-65308). There are two major issues with this approach:
- The name is misleading. General archive operations can never be made “safe” from all kinds of unwanted behavior, without impacting legitimate use cases.
- It does not solve the problem of unsafe defaults.
However, many of the ideas behind SafeTarFile were reused in this PEP.
Add absolute_path option to tarfile
Issue gh-73974 asks for adding an absolute_path option to extraction
methods. This would be a minimal change to formally resolve CVE-2007-4559.
It doesn’t go far enough to protect the unaware, nor to empower the diligent.
Other names for the 'tar' filter
The 'tar' filter exposes features specific to UNIX-like filesystems,
so it could be named after them instead.
Feature-wise, the tar format and UNIX-like filesystems are essentially
equivalent, so tar is a good name.
Possible Further Work
Adding filters to zipfile and shutil.unpack_archive
The zipfile module and shutil.unpack_archive() could gain support
for filters.
However, this would require research that this PEP’s author can’t promise
for Python 3.12.
A filter in zipfile would probably not help security.
Zip is used primarily for cross-platform data bundles, and correspondingly,
ZipFile.extract’s defaults are already similar to what a
'data' filter would do.
A 'fully_trusted' filter, which would newly allow absolute paths and
.. path components, might not be useful for much except special cases.
Filters should be useful for use cases other than security, but those
would usually need custom filter functions, and those would need an API
that works across archive formats.
That is definitely out of scope of this PEP.
If only this PEP is implemented and nothing changes for zipfile,
the effect for callers of unpack_archive is that the default
for tar files is changing from 'fully_trusted' to
the more appropriate 'data'.
In the interim period, Python 3.12-3.13 will emit DeprecationWarning.
That’s annoying, but there are several ways to handle it: e.g. add a
filter argument conditionally, set TarFile.extraction_filter
globally, or ignore/suppress the warning until Python 3.14.
Also, since many calls to unpack_archive are likely to be unsafe,
there’s hope that the DeprecationWarning will often turn out to be
a helpful hint to review affected code.
This proposal is based on prior work and discussions by many people, in particular Lars Gustäbel, Gregory P. Smith, Larry Hastings, Joachim Wagner, Jan Matejek, Jakub Wilk, Daniel Garcia, Lumír Balhar, Miro Hrončok, and many others.
This document is placed in the public domain or under the CC0-1.0-Universal license, whichever is more permissive.
Last modified: 2023-09-09 17:39:29 GMT