This applied black to the 20 smallest files in mercurial/:
ls -S1 mercurial/*.py | tail -n20 | xargs black
Reviewers: indygreg, durin42, baymax, hg-reviewers
Lint: Skipped
Unit Tests: Skipped
The import parentheses collapsing is described here: https://github.com/ambv/black#parentheses. The behavior is not configurable.
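The collapsing decision is purely mechanical: black joins a parenthesized "from X import (...)" onto one line whenever the joined form fits within its line-length limit (88 columns by default). A minimal sketch of that rule — an illustration of the behavior, not black's actual implementation:

```python
# Illustrative sketch of black's import-collapsing rule. This is NOT
# black's code; it only mimics the decision: join the import onto one
# line when the result fits the line-length limit, else keep the
# parenthesized one-name-per-line form.

LINE_LENGTH = 88  # black's default line length


def collapse_import(module, names, line_length=LINE_LENGTH):
    """Render an import on one line if it fits, else one name per line."""
    one_line = "from %s import %s" % (module, ", ".join(names))
    if len(one_line) <= line_length:
        return one_line
    body = "".join("    %s,\n" % name for name in names)
    return "from %s import (\n%s)" % (module, body)


print(collapse_import(".", ["error", "pycompat"]))
# -> from . import error, pycompat
```

Since `from . import error, pycompat` is well under 88 columns, black collapses it; a list of names too long to fit stays parenthesized.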
Inline comment on contrib/check-commit, line 43 (On Diff #12059): I think we'll want to send this chunk as a separate commit, as it is a policy change. I'd be happy to accept that patch today.
Where do we stand on the intent to mass reformat the code base?
I'm not super thrilled at some of black's decisions (like using double quotes for all strings and merging imports into the same line which leads to excessive code churn later). But the exact style doesn't matter as much as having it be consistent. So I'm willing to go with black if we feel it's the best tool for the job. I'm not sure what the alternatives are these days.
Inline comment on mercurial/mergeutil.py, line 18: I'm surprised by this result. I'd like to think the reformatting tool would be smarter than this.
I look at the changes and see nitpicks at best. On the one hand, black proved better than any linter that we can already write consistent code. On the other, if black were a linter... I'd switch to flake8, which at least is configurable.
In D5064#78545, @av6 wrote:
> I look at the changes and see nitpicks at best. On the one hand, black proved better than any linter that we can already write consistent code. On the other, if black were a linter... I'd switch to flake8, which at least is configurable.
The whole point of black is that it is not configurable. Configurable means you still have to argue about style and decide on a configuration.
In D5064#78780, @mjpieters wrote:
> In D5064#78545, @av6 wrote:
> > I look at the changes and see nitpicks at best. On the one hand, black proved better than any linter that we can already write consistent code. On the other, if black were a linter... I'd switch to flake8, which at least is configurable.
>
> The whole point of black is that it is not configurable. Configurable means you still have to argue about style and decide on a configuration.
Yep. I find I care less about style decisions when the computer is able to make them for me: eg I prefer ' to ", but since black can fix that and I don't have to type the " myself, I don't really care.
Does anyone strongly object, or do we want to embrace black and the formatting changes it implies?
In D5064#78990, @durin42 wrote:
> Does anyone strongly object, or do we want to embrace black and the formatting changes it implies?
Is there a fix extension configuration (and I guess a pre-commit hook; I've never used it) for this? If so, maybe it should go on the ContributingChanges page.
In D5064#78994, @mharbison72 wrote:
> In D5064#78990, @durin42 wrote:
> > Does anyone strongly object, or do we want to embrace black and the formatting changes it implies?
>
> Is there a fix extension configuration (and I guess a pre-commit hook; I've never used it) for this? If so, maybe it should go on the ContributingChanges page.
My plan was to put an hg fix configuration in contrib and mention it in ContributingChanges. I'd advise against a hook for this, but if someone else wanted to write one I'd at least drop it in contrib as an example...
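For reference, a fix configuration along those lines might look like the following hgrc snippet. The tool name and flags here are illustrative assumptions, not the configuration that actually landed in contrib:

```ini
# Hypothetical hgrc snippet -- tool name and flags are assumptions.
[extensions]
fix =

[fix]
# fix feeds each matching file to the tool on stdin and takes stdout as
# the fixed content; "-" tells black to read from stdin.
black:command = black --quiet -
black:pattern = set:**.py
```

With something like this in place, `hg fix --working-dir` would reformat the Python files touched by the working-directory changes.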
Per mailing list thread, I've sent out https://phab.mercurial-scm.org/D5539 to show what yapf would want to do.
There has been no activity on this Diff for the past 3 months.
By policy, we are automatically moving it out of the need-review state.
Please move it back to need-review without hesitation if this diff should still be discussed.
Status | Path (M = modified, A = added)
---|---
M | contrib/import-checker.py (2 lines)
M | mercurial/cacheutil.py (13 lines)
M | mercurial/diffhelper.py (24 lines)
M | mercurial/dirstateguard.py (47 lines)
M | mercurial/httpconnection.py (51 lines)
M | mercurial/lsprofcalltree.py (47 lines)
M | mercurial/mergeutil.py (18 lines)
M | mercurial/minifileset.py (63 lines)
M | mercurial/node.py (10 lines)
M | mercurial/policy.py (61 lines)
M | mercurial/pushkey.py (37 lines)
M | mercurial/rcutil.py (51 lines)
M | mercurial/rewriteutil.py (13 lines)
M | mercurial/scmposix.py (59 lines)
M | mercurial/scmwindows.py (43 lines)
M | mercurial/stack.py (10 lines)
M | mercurial/state.py (26 lines)
M | mercurial/txnutil.py (18 lines)
A | pyproject.toml (3 lines)
M | tests/test-check-code.t (1 line)
Author | Date
---|---
Martijn Pieters | Oct 14 2018, 9:40 AM
contrib/import-checker.py:

 """Verify a file conforms to the modern import convention rules.
 The rules of the modern convention are:
 * Ordering is stdlib followed by local imports. Each group is lexically
   sorted.
 * Importing multiple modules via "import X, Y" is not allowed: use
   separate import statements.
-* Importing multiple modules via "from X import ..." is allowed if using
-  parenthesis and one entry per line.
 * Only 1 relative import statement per import level ("from .", "from ..")
   is allowed.
 * Relative imports from higher levels must occur before lower levels. e.g.
   "from .." must be before "from .".
 * Imports from peer packages should use relative import (e.g. do not
   "import mercurial.foo" from a "mercurial.*" module).
 * Symbols can only be imported from specific modules (see
   `allowsymbolimports`). For other modules, first import the module then
mercurial/cacheutil.py:

 # scmutil.py - Mercurial core utility functions
 #
 # Copyright Matt Mackall <mpm@selenic.com> and other
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 from __future__ import absolute_import
 from . import repoview
 def cachetocopy(srcrepo):
     """return the list of cache file valuable to copy during a clone"""
     # In local clones we're copying all nodes, not just served
     # ones. Therefore copy all branch caches over.
-    cachefiles = ['branch2']
-    cachefiles += ['branch2-%s' % f for f in repoview.filtertable]
-    cachefiles += ['rbc-names-v1', 'rbc-revs-v1']
-    cachefiles += ['tags2']
-    cachefiles += ['tags2-%s' % f for f in repoview.filtertable]
-    cachefiles += ['hgtagsfnodes1']
+    cachefiles = ["branch2"]
+    cachefiles += ["branch2-%s" % f for f in repoview.filtertable]
+    cachefiles += ["rbc-names-v1", "rbc-revs-v1"]
+    cachefiles += ["tags2"]
+    cachefiles += ["tags2-%s" % f for f in repoview.filtertable]
+    cachefiles += ["hgtagsfnodes1"]
     return cachefiles
mercurial/diffhelper.py:

 # diffhelper.py - helper routines for patch
 #
 # Copyright 2009 Matt Mackall <mpm@selenic.com> and others
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 from __future__ import absolute_import
 from .i18n import _
-from . import (
-    error,
-    pycompat,
-)
+from . import error, pycompat
 def addlines(fp, hunk, lena, lenb, a, b):
     """Read lines from fp into the hunk
     The hunk is parsed into two arrays, a and b. a gets the old state of
     the text, b gets the new state. The control char from the hunk is saved
     when inserting into a, but not b (for performance while deleting files.)
     """
     while True:
         todoa = lena - len(a)
         todob = lenb - len(b)
         num = max(todoa, todob)
         if num == 0:
             break
         for i in pycompat.xrange(num):
             s = fp.readline()
             if not s:
-                raise error.ParseError(_('incomplete hunk'))
+                raise error.ParseError(_("incomplete hunk"))
             if s == "\\ No newline at end of file\n":
                 fixnewline(hunk, a, b)
                 continue
-            if s == '\n' or s == '\r\n':
+            if s == "\n" or s == "\r\n":
                 # Some patches may be missing the control char
                 # on empty lines. Supply a leading space.
-                s = ' ' + s
+                s = " " + s
             hunk.append(s)
-            if s.startswith('+'):
+            if s.startswith("+"):
                 b.append(s[1:])
-            elif s.startswith('-'):
+            elif s.startswith("-"):
                 a.append(s)
             else:
                 b.append(s[1:])
                 a.append(s)
 def fixnewline(hunk, a, b):
     """Fix up the last lines of a and b when the patch has no newline at EOF"""
     l = hunk[-1]
     # tolerate CRLF in last line
-    if l.endswith('\r\n'):
+    if l.endswith("\r\n"):
         hline = l[:-2]
     else:
         hline = l[:-1]
-    if hline.startswith((' ', '+')):
+    if hline.startswith((" ", "+")):
         b[-1] = hline[1:]
-    if hline.startswith((' ', '-')):
+    if hline.startswith((" ", "-")):
         a[-1] = hline
     hunk[-1] = hline
 def testhunk(a, b, bstart):
     """Compare the lines in a with the lines in b
     a is assumed to have a control char at the start of each line, this char
     is ignored in the compare.
     """
     alen = len(a)
     blen = len(b)
     if alen > blen - bstart or bstart < 0:
         return False
     for i in pycompat.xrange(alen):
         if a[i][1:] != b[i + bstart]:
             return False
     return True
mercurial/dirstateguard.py:

 # dirstateguard.py - class to allow restoring dirstate after failure
 #
 # Copyright 2005-2007 Matt Mackall <mpm@selenic.com>
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 from __future__ import absolute_import
 from .i18n import _
-from . import (
-    error,
-    narrowspec,
-    util,
-)
+from . import error, narrowspec, util
 class dirstateguard(util.transactional):
-    '''Restore dirstate at unexpected failure.
+    """Restore dirstate at unexpected failure.
     At the construction, this class does:
     - write current ``repo.dirstate`` out, and
     - save ``.hg/dirstate`` into the backup file
     This restores ``.hg/dirstate`` from backup file, if ``release()``
     is invoked before ``close()``.
     This just removes the backup file at ``close()`` before ``release()``.
-    '''
+    """
     def __init__(self, repo, name):
         self._repo = repo
         self._active = False
         self._closed = False
-        self._backupname = 'dirstate.backup.%s.%d' % (name, id(self))
-        self._narrowspecbackupname = ('narrowspec.backup.%s.%d' %
-                                      (name, id(self)))
+        self._backupname = "dirstate.backup.%s.%d" % (name, id(self))
+        self._narrowspecbackupname = "narrowspec.backup.%s.%d" % (
+            name,
+            id(self),
+        )
         repo.dirstate.savebackup(repo.currenttransaction(), self._backupname)
         narrowspec.savebackup(repo, self._narrowspecbackupname)
         self._active = True
     def __del__(self):
         if self._active:  # still active
             # this may occur, even if this class is used correctly:
             # for example, releasing other resources like transaction
             # may raise exception before ``dirstateguard.release`` in
             # ``release(tr, ....)``.
             self._abort()
     def close(self):
         if not self._active:  # already inactivated
-            msg = (_("can't close already inactivated backup: %s")
-                   % self._backupname)
+            msg = (
+                _("can't close already inactivated backup: %s")
+                % self._backupname
+            )
             raise error.Abort(msg)
-        self._repo.dirstate.clearbackup(self._repo.currenttransaction(),
-                                        self._backupname)
+        self._repo.dirstate.clearbackup(
+            self._repo.currenttransaction(), self._backupname
+        )
         narrowspec.clearbackup(self._repo, self._narrowspecbackupname)
         self._active = False
         self._closed = True
     def _abort(self):
         narrowspec.restorebackup(self._repo, self._narrowspecbackupname)
-        self._repo.dirstate.restorebackup(self._repo.currenttransaction(),
-                                          self._backupname)
+        self._repo.dirstate.restorebackup(
+            self._repo.currenttransaction(), self._backupname
+        )
         self._active = False
     def release(self):
         if not self._closed:
             if not self._active:  # already inactivated
-                msg = (_("can't release already inactivated backup: %s")
-                       % self._backupname)
+                msg = (
+                    _("can't release already inactivated backup: %s")
+                    % self._backupname
+                )
                 raise error.Abort(msg)
             self._abort()
mercurial/httpconnection.py:

 # httpconnection.py - urllib2 handler for new http support
 #
 # Copyright 2005, 2006, 2007, 2008 Matt Mackall <mpm@selenic.com>
 # Copyright 2006, 2007 Alexis S. L. Carvalho <alexis@cecm.usp.br>
 # Copyright 2006 Vadim Gelfer <vadim.gelfer@gmail.com>
 # Copyright 2011 Google, Inc.
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 from __future__ import absolute_import
 import os
 from .i18n import _
-from . import (
-    pycompat,
-    util,
-)
+from . import pycompat, util
 urlerr = util.urlerr
 urlreq = util.urlreq
 # moved here from url.py to avoid a cycle
 class httpsendfile(object):
     """This is a wrapper around the objects returned by python's "open".
[...]
         self.write = self._data.write
         self.length = os.fstat(self._data.fileno()).st_size
         self._pos = 0
         # We pass double the max for total because we currently have
         # to send the bundle twice in the case of a server that
         # requires authentication. Since we can't know until we try
         # once whether authentication will be required, just lie to
         # the user and maybe the push succeeds suddenly at 50%.
-        self._progress = ui.makeprogress(_('sending'), unit=_('kb'),
-                                         total=(self.length // 1024 * 2))
+        self._progress = ui.makeprogress(
+            _("sending"), unit=_("kb"), total=(self.length // 1024 * 2)
+        )
     def read(self, *args, **kwargs):
         ret = self._data.read(*args, **kwargs)
         if not ret:
             self._progress.complete()
             return ret
         self._pos += len(ret)
         self._progress.update(self._pos // 1024)
         return ret
     def __enter__(self):
         return self
     def __exit__(self, exc_type, exc_val, exc_tb):
         self.close()
 # moved here from url.py to avoid a cycle
 def readauthforuri(ui, uri, user):
     uri = pycompat.bytesurl(uri)
     # Read configuration
     groups = {}
-    for key, val in ui.configitems('auth'):
-        if key in ('cookiefile',):
+    for key, val in ui.configitems("auth"):
+        if key in ("cookiefile",):
             continue
-        if '.' not in key:
+        if "." not in key:
             ui.warn(_("ignoring invalid [auth] key '%s'\n") % key)
             continue
-        group, setting = key.rsplit('.', 1)
+        group, setting = key.rsplit(".", 1)
         gdict = groups.setdefault(group, {})
-        if setting in ('username', 'cert', 'key'):
+        if setting in ("username", "cert", "key"):
             val = util.expandpath(val)
         gdict[setting] = val
     # Find the best match
-    scheme, hostpath = uri.split('://', 1)
+    scheme, hostpath = uri.split("://", 1)
     bestuser = None
     bestlen = 0
     bestauth = None
     for group, auth in groups.iteritems():
-        if user and user != auth.get('username', user):
+        if user and user != auth.get("username", user):
             # If a username was set in the URI, the entry username
             # must either match it or be unset
             continue
-        prefix = auth.get('prefix')
+        prefix = auth.get("prefix")
         if not prefix:
             continue
-        p = prefix.split('://', 1)
+        p = prefix.split("://", 1)
         if len(p) > 1:
             schemes, prefix = [p[0]], p[1]
         else:
-            schemes = (auth.get('schemes') or 'https').split()
-        if (prefix == '*' or hostpath.startswith(prefix)) and \
-           (len(prefix) > bestlen or (len(prefix) == bestlen and \
-                not bestuser and 'username' in auth)) \
-           and scheme in schemes:
+            schemes = (auth.get("schemes") or "https").split()
+        if (
+            (prefix == "*" or hostpath.startswith(prefix))
+            and (
+                len(prefix) > bestlen
+                or (
+                    len(prefix) == bestlen
+                    and not bestuser
+                    and "username" in auth
+                )
+            )
+            and scheme in schemes
+        ):
             bestlen = len(prefix)
             bestauth = group, auth
-            bestuser = auth.get('username')
+            bestuser = auth.get("username")
             if user and not bestuser:
-                auth['username'] = user
+                auth["username"] = user
     return bestauth
mercurial/lsprofcalltree.py:

 """
 lsprofcalltree.py - lsprof output which is readable by kcachegrind
 Authors:
 * David Allouche <david <at> allouche.net>
 * Jp Calderone & Itamar Shtull-Trauring
 * Johan Dahlin
 This software may be used and distributed according to the terms
 of the GNU General Public License, incorporated herein by reference.
 """
 from __future__ import absolute_import
-from . import (
-    pycompat,
-)
+from . import pycompat
 def label(code):
     if isinstance(code, str):
         # built-in functions ('~' sorts at the end)
-        return '~' + pycompat.sysbytes(code)
+        return "~" + pycompat.sysbytes(code)
     else:
-        return '%s %s:%d' % (pycompat.sysbytes(code.co_name),
-                             pycompat.sysbytes(code.co_filename),
-                             code.co_firstlineno)
+        return "%s %s:%d" % (
+            pycompat.sysbytes(code.co_name),
+            pycompat.sysbytes(code.co_filename),
+            code.co_firstlineno,
+        )
 class KCacheGrind(object):
     def __init__(self, profiler):
         self.data = profiler.getstats()
         self.out_file = None
     def output(self, out_file):
         self.out_file = out_file
-        out_file.write(b'events: Ticks\n')
+        out_file.write(b"events: Ticks\n")
         self._print_summary()
         for entry in self.data:
             self._entry(entry)
     def _print_summary(self):
         max_cost = 0
         for entry in self.data:
             totaltime = int(entry.totaltime * 1000)
             max_cost = max(max_cost, totaltime)
-        self.out_file.write(b'summary: %d\n' % max_cost)
+        self.out_file.write(b"summary: %d\n" % max_cost)
     def _entry(self, entry):
         out_file = self.out_file
         code = entry.code
         if isinstance(code, str):
-            out_file.write(b'fi=~\n')
+            out_file.write(b"fi=~\n")
         else:
-            out_file.write(b'fi=%s\n' % pycompat.sysbytes(code.co_filename))
-        out_file.write(b'fn=%s\n' % label(code))
+            out_file.write(b"fi=%s\n" % pycompat.sysbytes(code.co_filename))
+        out_file.write(b"fn=%s\n" % label(code))
         inlinetime = int(entry.inlinetime * 1000)
         if isinstance(code, str):
-            out_file.write(b'0 %d\n' % inlinetime)
+            out_file.write(b"0 %d\n" % inlinetime)
         else:
-            out_file.write(b'%d %d\n' % (code.co_firstlineno, inlinetime))
+            out_file.write(b"%d %d\n" % (code.co_firstlineno, inlinetime))
         # recursive calls are counted in entry.calls
         if entry.calls:
             calls = entry.calls
         else:
             calls = []
         if isinstance(code, str):
             lineno = 0
         else:
             lineno = code.co_firstlineno
         for subentry in calls:
             self._subentry(lineno, subentry)
-        out_file.write(b'\n')
+        out_file.write(b"\n")
     def _subentry(self, lineno, subentry):
         out_file = self.out_file
         code = subentry.code
-        out_file.write(b'cfn=%s\n' % label(code))
+        out_file.write(b"cfn=%s\n" % label(code))
         if isinstance(code, str):
-            out_file.write(b'cfi=~\n')
-            out_file.write(b'calls=%d 0\n' % subentry.callcount)
+            out_file.write(b"cfi=~\n")
+            out_file.write(b"calls=%d 0\n" % subentry.callcount)
         else:
-            out_file.write(b'cfi=%s\n' % pycompat.sysbytes(code.co_filename))
-            out_file.write(b'calls=%d %d\n' % (
-                subentry.callcount, code.co_firstlineno))
+            out_file.write(b"cfi=%s\n" % pycompat.sysbytes(code.co_filename))
+            out_file.write(
+                b"calls=%d %d\n" % (subentry.callcount, code.co_firstlineno)
+            )
         totaltime = int(subentry.totaltime * 1000)
-        out_file.write(b'%d %d\n' % (lineno, totaltime))
+        out_file.write(b"%d %d\n" % (lineno, totaltime))
mercurial/minifileset.py:

 # minifileset.py - a simple language to select files
 #
 # Copyright 2017 Facebook, Inc.
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 from __future__ import absolute_import
 from .i18n import _
-from . import (
-    error,
-    fileset,
-    filesetlang,
-    pycompat,
-)
+from . import error, fileset, filesetlang, pycompat
 def _sizep(x):
     # i18n: "size" is a keyword
     expr = filesetlang.getstring(x, _("size requires an expression"))
     return fileset.sizematcher(expr)
 def _compile(tree):
     if not tree:
         raise error.ParseError(_("missing argument"))
     op = tree[0]
-    if op == 'withstatus':
+    if op == "withstatus":
         return _compile(tree[1])
-    elif op in {'symbol', 'string', 'kindpat'}:
-        name = filesetlang.getpattern(tree, {'path'}, _('invalid file pattern'))
-        if name.startswith('**'):  # file extension test, ex. "**.tar.gz"
+    elif op in {"symbol", "string", "kindpat"}:
+        name = filesetlang.getpattern(tree, {"path"}, _("invalid file pattern"))
+        if name.startswith("**"):  # file extension test, ex. "**.tar.gz"
             ext = name[2:]
             for c in pycompat.bytestr(ext):
-                if c in '*{}[]?/\\':
-                    raise error.ParseError(_('reserved character: %s') % c)
+                if c in "*{}[]?/\\":
+                    raise error.ParseError(_("reserved character: %s") % c)
             return lambda n, s: n.endswith(ext)
-        elif name.startswith('path:'):  # directory or full path test
+        elif name.startswith("path:"):  # directory or full path test
             p = name[5:]  # prefix
             pl = len(p)
-            f = lambda n, s: n.startswith(p) and (len(n) == pl
-                                                  or n[pl:pl + 1] == '/')
+            f = lambda n, s: n.startswith(p) and (
+                len(n) == pl or n[pl : pl + 1] == "/"
+            )
             return f
-        raise error.ParseError(_("unsupported file pattern: %s") % name,
-                               hint=_('paths must be prefixed with "path:"'))
-    elif op in {'or', 'patterns'}:
+        raise error.ParseError(
+            _("unsupported file pattern: %s") % name,
+            hint=_('paths must be prefixed with "path:"'),
+        )
+    elif op in {"or", "patterns"}:
         funcs = [_compile(x) for x in tree[1:]]
         return lambda n, s: any(f(n, s) for f in funcs)
-    elif op == 'and':
+    elif op == "and":
         func1 = _compile(tree[1])
         func2 = _compile(tree[2])
         return lambda n, s: func1(n, s) and func2(n, s)
-    elif op == 'not':
+    elif op == "not":
         return lambda n, s: not _compile(tree[1])(n, s)
-    elif op == 'func':
+    elif op == "func":
         symbols = {
-            'all': lambda n, s: True,
-            'none': lambda n, s: False,
-            'size': lambda n, s: _sizep(tree[2])(s),
+            "all": lambda n, s: True,
+            "none": lambda n, s: False,
+            "size": lambda n, s: _sizep(tree[2])(s),
         }
         name = filesetlang.getsymbol(tree[1])
         if name in symbols:
             return symbols[name]
         raise error.UnknownIdentifier(name, symbols.keys())
-    elif op == 'minus':  # equivalent to 'x and not y'
+    elif op == "minus":  # equivalent to 'x and not y'
         func1 = _compile(tree[1])
         func2 = _compile(tree[2])
         return lambda n, s: func1(n, s) and not func2(n, s)
-    elif op == 'list':
-        raise error.ParseError(_("can't use a list in this context"),
-                               hint=_('see \'hg help "filesets.x or y"\''))
-    raise error.ProgrammingError('illegal tree: %r' % (tree,))
+    elif op == "list":
+        raise error.ParseError(
+            _("can't use a list in this context"),
+            hint=_("see 'hg help \"filesets.x or y\"'"),
+        )
+    raise error.ProgrammingError("illegal tree: %r" % (tree,))
 def compile(text):
     """generate a function (path, size) -> bool from filter specification.
     "text" could contain the operators defined by the fileset language for
     common logic operations, and parenthesis for grouping. The supported path
tests are '**.extname' for file extension test, and '"path:dir/subdir"' | tests are '**.extname' for file extension test, and '"path:dir/subdir"' | ||||
for prefix test. The ``size()`` predicate is borrowed from filesets to test | for prefix test. The ``size()`` predicate is borrowed from filesets to test |
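The `path:` branch compiled above boils down to a small prefix check. A minimal standalone sketch (the name `prefixtest` is mine, not mercurial's):

```python
# Sketch of the compiled "path:" predicate: a candidate path n matches
# prefix p only when it is exactly p or continues with a "/" separator,
# so "dir" matches "dir" and "dir/a.txt" but not "directory".
def prefixtest(p):
    pl = len(p)
    return lambda n: n.startswith(p) and (len(n) == pl or n[pl : pl + 1] == "/")
```

This also shows that black's `n[pl : pl + 1]` rewrite is purely cosmetic; the slice semantics are unchanged.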
 # profiles, we can use this version only on Python 3, and forward
 # binascii.unhexlify like we used to on Python 2.
 def bin(s):
     try:
         return binascii.unhexlify(s)
     except binascii.Error as e:
         raise TypeError(e)

 nullrev = -1
 # In hex, this is '0000000000000000000000000000000000000000'
 nullid = b"\0" * 20
 nullhex = hex(nullid)

 # Phony node value to stand-in for new files in some uses of
 # manifests.
 # In hex, this is '2121212121212121212121212121212121212121'
-newnodeid = '!!!!!!!!!!!!!!!!!!!!'
+newnodeid = "!!!!!!!!!!!!!!!!!!!!"
 # In hex, this is '3030303030303030303030303030306164646564'
-addednodeid = '000000000000000added'
+addednodeid = "000000000000000added"
 # In hex, this is '3030303030303030303030306d6f646966696564'
-modifiednodeid = '000000000000modified'
+modifiednodeid = "000000000000modified"

 wdirfilenodeids = {newnodeid, addednodeid, modifiednodeid}

 # pseudo identifiers for working directory
 # (they are experimental, so don't add too many dependencies on them)
-wdirrev = 0x7fffffff
+wdirrev = 0x7FFFFFFF
 # In hex, this is 'ffffffffffffffffffffffffffffffffffffffff'
 wdirid = b"\xff" * 20
 wdirhex = hex(wdirid)

 def short(node):
     return hex(node[:6])
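The "In hex, this is ..." comments above can be checked mechanically. This sketch uses `binascii` directly (which is what node.py's `hex`/`bin` wrap) and also confirms that black's `0x7FFFFFFF` rewrite is value-preserving:

```python
import binascii

# 20 NUL bytes hex-encode to 40 zeros, and 20 "!" bytes (0x21) encode
# to "21" repeated 20 times, matching the comments in node.py.
nullid = b"\0" * 20
newnodeid = b"!" * 20

def short(node):
    # first 6 bytes of a node -> 12 hex digits
    return binascii.hexlify(node[:6])
```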
 # allow - allow pure Python implementation when C loading fails
 # cffi - required cffi versions (implemented within pure module)
 # cffi-allow - allow pure Python implementation if cffi version is missing
 # py - only load pure Python modules
 #
 # By default, fall back to the pure modules so the in-place build can
 # run without recompiling the C extensions. This will be overridden by
 # __modulepolicy__ generated by setup.py.
-policy = b'allow'
+policy = b"allow"
 _packageprefs = {
     # policy: (versioned package, pure package)
-    b'c': (r'cext', None),
-    b'allow': (r'cext', r'pure'),
-    b'cffi': (r'cffi', None),
-    b'cffi-allow': (r'cffi', r'pure'),
-    b'py': (None, r'pure'),
+    b"c": (r"cext", None),
+    b"allow": (r"cext", r"pure"),
+    b"cffi": (r"cffi", None),
+    b"cffi-allow": (r"cffi", r"pure"),
+    b"py": (None, r"pure"),
 }

 try:
     from . import __modulepolicy__
     policy = __modulepolicy__.modulepolicy
 except ImportError:
     pass

 # PyPy doesn't load C extensions.
 #
 # The canonical way to do this is to test platform.python_implementation().
 # But we don't import platform and don't bloat for it here.
-if r'__pypy__' in sys.builtin_module_names:
-    policy = b'cffi'
+if r"__pypy__" in sys.builtin_module_names:
+    policy = b"cffi"

 # Environment variable can always force settings.
 if sys.version_info[0] >= 3:
-    if r'HGMODULEPOLICY' in os.environ:
-        policy = os.environ[r'HGMODULEPOLICY'].encode(r'utf-8')
+    if r"HGMODULEPOLICY" in os.environ:
+        policy = os.environ[r"HGMODULEPOLICY"].encode(r"utf-8")
 else:
-    policy = os.environ.get(r'HGMODULEPOLICY', policy)
+    policy = os.environ.get(r"HGMODULEPOLICY", policy)

 def _importfrom(pkgname, modname):
     # from .<pkgname> import <modname> (where . is looked through this module)
     fakelocals = {}
     pkg = __import__(pkgname, globals(), fakelocals, [modname], level=1)
     try:
         fakelocals[modname] = mod = getattr(pkg, modname)
     except AttributeError:
-        raise ImportError(r'cannot import name %s' % modname)
+        raise ImportError(r"cannot import name %s" % modname)
     # force import; fakelocals[modname] may be replaced with the real module
-    getattr(mod, r'__doc__', None)
+    getattr(mod, r"__doc__", None)
     return fakelocals[modname]

 # keep in sync with "version" in C modules
 _cextversions = {
-    (r'cext', r'base85'): 1,
-    (r'cext', r'bdiff'): 3,
-    (r'cext', r'mpatch'): 1,
-    (r'cext', r'osutil'): 4,
-    (r'cext', r'parsers'): 11,
+    (r"cext", r"base85"): 1,
+    (r"cext", r"bdiff"): 3,
+    (r"cext", r"mpatch"): 1,
+    (r"cext", r"osutil"): 4,
+    (r"cext", r"parsers"): 11,
 }

 # map import request to other package or module
 _modredirects = {
-    (r'cext', r'charencode'): (r'cext', r'parsers'),
-    (r'cffi', r'base85'): (r'pure', r'base85'),
-    (r'cffi', r'charencode'): (r'pure', r'charencode'),
-    (r'cffi', r'parsers'): (r'pure', r'parsers'),
+    (r"cext", r"charencode"): (r"cext", r"parsers"),
+    (r"cffi", r"base85"): (r"pure", r"base85"),
+    (r"cffi", r"charencode"): (r"pure", r"charencode"),
+    (r"cffi", r"parsers"): (r"pure", r"parsers"),
 }

 def _checkmod(pkgname, modname, mod):
     expected = _cextversions.get((pkgname, modname))
-    actual = getattr(mod, r'version', None)
+    actual = getattr(mod, r"version", None)
     if actual != expected:
-        raise ImportError(r'cannot import module %s.%s '
-                          r'(expected version: %d, actual: %r)'
-                          % (pkgname, modname, expected, actual))
+        raise ImportError(
+            r"cannot import module %s.%s "
+            r"(expected version: %d, actual: %r)"
+            % (pkgname, modname, expected, actual)
+        )

 def importmod(modname):
     """Import module according to policy and check API version"""
     try:
         verpkg, purepkg = _packageprefs[policy]
     except KeyError:
-        raise ImportError(r'invalid HGMODULEPOLICY %r' % policy)
+        raise ImportError(r"invalid HGMODULEPOLICY %r" % policy)
     assert verpkg or purepkg
     if verpkg:
         pn, mn = _modredirects.get((verpkg, modname), (verpkg, modname))
         try:
             mod = _importfrom(pn, mn)
             if pn == verpkg:
                 _checkmod(pn, mn, mod)
             return mod
         except ImportError:
             if not purepkg:
                 raise
     pn, mn = _modredirects.get((purepkg, modname), (purepkg, modname))
     return _importfrom(pn, mn)
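`importmod` is a try-versioned-then-fall-back loop. A toy version of the same pattern, using `importlib` against plain package-name prefixes instead of mercurial's `_modredirects`/`_checkmod` machinery (all names here are illustrative):

```python
import importlib

def importmod(modname, prefixes):
    # Try each candidate prefix in order; a None entry means "no such
    # flavor", mirroring the (verpkg, purepkg) pairs in _packageprefs.
    lasterr = ImportError(modname)
    for prefix in prefixes:
        if prefix is None:
            continue
        try:
            return importlib.import_module(prefix + modname)
        except ImportError as e:
            lasterr = e
    # nothing worked: re-raise the most recent import failure
    raise lasterr
```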
 # pushkey.py - dispatching for pushing and pulling keys
 #
 # Copyright 2010 Matt Mackall <mpm@selenic.com>
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.

 from __future__ import absolute_import

-from . import (
-    bookmarks,
-    encoding,
-    obsolete,
-    phases,
-)
+from . import bookmarks, encoding, obsolete, phases

 def _nslist(repo):
     n = {}
     for k in _namespaces:
         n[k] = ""
     if not obsolete.isenabled(repo, obsolete.exchangeopt):
-        n.pop('obsolete')
+        n.pop("obsolete")
     return n

-_namespaces = {"namespaces": (lambda *x: False, _nslist),
-               "bookmarks": (bookmarks.pushbookmark, bookmarks.listbookmarks),
-               "phases": (phases.pushphase, phases.listphases),
-               "obsolete": (obsolete.pushmarker, obsolete.listmarkers),
-               }
+_namespaces = {
+    "namespaces": (lambda *x: False, _nslist),
+    "bookmarks": (bookmarks.pushbookmark, bookmarks.listbookmarks),
+    "phases": (phases.pushphase, phases.listphases),
+    "obsolete": (obsolete.pushmarker, obsolete.listmarkers),
+}

 def register(namespace, pushkey, listkeys):
     _namespaces[namespace] = (pushkey, listkeys)

 def _get(namespace):
     return _namespaces.get(namespace, (lambda *x: False, lambda *x: {}))

 def push(repo, namespace, key, old, new):
-    '''should succeed iff value was old'''
+    """should succeed iff value was old"""
     pk = _get(namespace)[0]
     return pk(repo, key, old, new)

 def list(repo, namespace):
-    '''return a dict'''
+    """return a dict"""
     lk = _get(namespace)[1]
     return lk(repo)

 encode = encoding.fromlocal

 decode = encoding.tolocal

 def encodekeys(keys):
     """encode the content of a pushkey namespace for exchange over the wire"""
-    return '\n'.join(['%s\t%s' % (encode(k), encode(v)) for k, v in keys])
+    return "\n".join(["%s\t%s" % (encode(k), encode(v)) for k, v in keys])

 def decodekeys(data):
     """decode the content of a pushkey namespace from exchange over the wire"""
     result = {}
     for l in data.splitlines():
-        k, v = l.split('\t')
+        k, v = l.split("\t")
         result[decode(k)] = decode(v)
     return result
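The wire format handled by `encodekeys`/`decodekeys` is just tab-separated pairs, one per line. A str-based sketch with the encoding step left out (the real code additionally routes keys through `encoding.fromlocal`/`encoding.tolocal`):

```python
def encodekeys(keys):
    # one "key\tvalue" line per entry
    return "\n".join("%s\t%s" % (k, v) for k, v in keys)

def decodekeys(data):
    # inverse: split into lines, then split each line on the tab;
    # like the original, this assumes values contain no tabs
    result = {}
    for line in data.splitlines():
        k, v = line.split("\t")
        result[k] = v
    return result
```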
 # rcutil.py - utilities about config paths, special config sections etc.
 #
 # Copyright Mercurial Contributors
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.

 from __future__ import absolute_import

 import os

-from . import (
-    encoding,
-    pycompat,
-    util,
-)
+from . import encoding, pycompat, util

 if pycompat.iswindows:
     from . import scmwindows as scmplatform
 else:
     from . import scmposix as scmplatform

 fallbackpager = scmplatform.fallbackpager
 systemrcpath = scmplatform.systemrcpath
 userrcpath = scmplatform.userrcpath

 def _expandrcpath(path):
-    '''path could be a file or a directory. return a list of file paths'''
+    """path could be a file or a directory. return a list of file paths"""
     p = util.expandpath(path)
     if os.path.isdir(p):
         join = os.path.join
-        return [join(p, f) for f, k in util.listdir(p) if f.endswith('.rc')]
+        return [join(p, f) for f, k in util.listdir(p) if f.endswith(".rc")]
     return [p]

 def envrcitems(env=None):
-    '''Return [(section, name, value, source)] config items.
+    """Return [(section, name, value, source)] config items.

     The config items are extracted from environment variables specified by env,
     used to override systemrc, but not userrc.

     If env is not provided, encoding.environ will be used.
-    '''
+    """
     if env is None:
         env = encoding.environ
     checklist = [
-        ('EDITOR', 'ui', 'editor'),
-        ('VISUAL', 'ui', 'editor'),
-        ('PAGER', 'pager', 'pager'),
+        ("EDITOR", "ui", "editor"),
+        ("VISUAL", "ui", "editor"),
+        ("PAGER", "pager", "pager"),
     ]
     result = []
     for envname, section, configname in checklist:
         if envname not in env:
             continue
-        result.append((section, configname, env[envname], '$%s' % envname))
+        result.append((section, configname, env[envname], "$%s" % envname))
     return result

 def defaultrcpath():
-    '''return rc paths in default.d'''
+    """return rc paths in default.d"""
     path = []
-    defaultpath = os.path.join(util.datapath, 'default.d')
+    defaultpath = os.path.join(util.datapath, "default.d")
     if os.path.isdir(defaultpath):
         path = _expandrcpath(defaultpath)
     return path

 def rccomponents():
-    '''return an ordered [(type, obj)] about where to load configs.
+    """return an ordered [(type, obj)] about where to load configs.

     respect $HGRCPATH. if $HGRCPATH is empty, only .hg/hgrc of current repo is
     used. if $HGRCPATH is not set, the platform default will be used.

     if a directory is provided, *.rc files under it will be used.

     type could be either 'path' or 'items', if type is 'path', obj is a string,
     and is the config file path. if type is 'items', obj is a list of (section,
     name, value, source) that should fill the config directly.
-    '''
-    envrc = ('items', envrcitems())
+    """
+    envrc = ("items", envrcitems())

-    if 'HGRCPATH' in encoding.environ:
+    if "HGRCPATH" in encoding.environ:
         # assume HGRCPATH is all about user configs so environments can be
         # overridden.
         _rccomponents = [envrc]
-        for p in encoding.environ['HGRCPATH'].split(pycompat.ospathsep):
+        for p in encoding.environ["HGRCPATH"].split(pycompat.ospathsep):
             if not p:
                 continue
-            _rccomponents.extend(('path', p) for p in _expandrcpath(p))
+            _rccomponents.extend(("path", p) for p in _expandrcpath(p))
     else:
-        normpaths = lambda paths: [('path', os.path.normpath(p)) for p in paths]
+        normpaths = lambda paths: [("path", os.path.normpath(p)) for p in paths]
         _rccomponents = normpaths(defaultrcpath() + systemrcpath())
         _rccomponents.append(envrc)
         _rccomponents.extend(normpaths(userrcpath()))
     return _rccomponents

 def defaultpagerenv():
-    '''return a dict of default environment variables and their values,
+    """return a dict of default environment variables and their values,
     intended to be set before starting a pager.
-    '''
-    return {'LESS': 'FRX', 'LV': '-c'}
+    """
+    return {"LESS": "FRX", "LV": "-c"}
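`envrcitems` maps a fixed checklist of environment variables onto config items. A self-contained sketch that takes the environment as an explicit dict instead of `encoding.environ`:

```python
def envrcitems(env):
    # ($ENVVAR, section, name) triples that may override system config
    checklist = [
        ("EDITOR", "ui", "editor"),
        ("VISUAL", "ui", "editor"),
        ("PAGER", "pager", "pager"),
    ]
    # emit (section, name, value, source) for each variable that is set
    return [
        (section, name, env[var], "$%s" % var)
        for var, section, name in checklist
        if var in env
    ]
```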
 # rewriteutil.py - utility functions for rewriting changesets
 #
 # Copyright 2017 Octobus <contact@octobus.net>
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.

 from __future__ import absolute_import

 from .i18n import _

-from . import (
-    error,
-    node,
-    obsolete,
-    revset,
-)
+from . import error, node, obsolete, revset

-def precheck(repo, revs, action='rewrite'):
+def precheck(repo, revs, action="rewrite"):
     """check if revs can be rewritten
     action is used to control the error message.

     Make sure this function is called after taking the lock.
     """
     if node.nullrev in revs:
         msg = _("cannot %s null changeset") % (action)
         hint = _("no changeset checked out")
         raise error.Abort(msg, hint=hint)

-    publicrevs = repo.revs('%ld and public()', revs)
+    publicrevs = repo.revs("%ld and public()", revs)
     if len(repo[None].parents()) > 1:
         raise error.Abort(_("cannot %s while merging") % action)

     if publicrevs:
         msg = _("cannot %s public changesets") % (action)
         hint = _("see 'hg help phases' for details")
         raise error.Abort(msg, hint=hint)

     newunstable = disallowednewunstable(repo, revs)
     if newunstable:
         raise error.Abort(_("cannot %s changeset with children") % action)

 def disallowednewunstable(repo, revs):
     """Checks whether editing the revs will create new unstable changesets and
     are we allowed to create them.

     To allow new unstable changesets, set the config:
         `experimental.evolution.allowunstable=True`
     """
     allowunstable = obsolete.isenabled(repo, obsolete.allowunstableopt)
     if allowunstable:
         return revset.baseset()
     return repo.revs("(%ld::) - %ld", revs, revs)
 from __future__ import absolute_import

 import array
 import errno
 import fcntl
 import os
 import sys

-from . import (
-    encoding,
-    pycompat,
-    util,
-)
+from . import encoding, pycompat, util

 # BSD 'more' escapes ANSI color sequences by default. This can be disabled by
 # $MORE variable, but there's no compatible option with Linux 'more'. Given
 # OS X is widely used and most modern Unix systems would have 'less', setting
 # 'less' as the default seems reasonable.
-fallbackpager = 'less'
+fallbackpager = "less"

 def _rcfiles(path):
-    rcs = [os.path.join(path, 'hgrc')]
-    rcdir = os.path.join(path, 'hgrc.d')
+    rcs = [os.path.join(path, "hgrc")]
+    rcdir = os.path.join(path, "hgrc.d")
     try:
-        rcs.extend([os.path.join(rcdir, f)
-                    for f, kind in util.listdir(rcdir)
-                    if f.endswith(".rc")])
+        rcs.extend(
+            [
+                os.path.join(rcdir, f)
+                for f, kind in util.listdir(rcdir)
+                if f.endswith(".rc")
+            ]
+        )
     except OSError:
         pass
     return rcs

 def systemrcpath():
     path = []
-    if pycompat.sysplatform == 'plan9':
-        root = 'lib/mercurial'
+    if pycompat.sysplatform == "plan9":
+        root = "lib/mercurial"
     else:
-        root = 'etc/mercurial'
+        root = "etc/mercurial"
     # old mod_python does not set sys.argv
-    if len(getattr(sys, 'argv', [])) > 0:
+    if len(getattr(sys, "argv", [])) > 0:
         p = os.path.dirname(os.path.dirname(pycompat.sysargv[0]))
-        if p != '/':
+        if p != "/":
             path.extend(_rcfiles(os.path.join(p, root)))
-    path.extend(_rcfiles('/' + root))
+    path.extend(_rcfiles("/" + root))
     return path

 def userrcpath():
-    if pycompat.sysplatform == 'plan9':
-        return [encoding.environ['home'] + '/lib/hgrc']
+    if pycompat.sysplatform == "plan9":
+        return [encoding.environ["home"] + "/lib/hgrc"]
     elif pycompat.isdarwin:
-        return [os.path.expanduser('~/.hgrc')]
+        return [os.path.expanduser("~/.hgrc")]
     else:
-        confighome = encoding.environ.get('XDG_CONFIG_HOME')
+        confighome = encoding.environ.get("XDG_CONFIG_HOME")
         if confighome is None or not os.path.isabs(confighome):
-            confighome = os.path.expanduser('~/.config')
+            confighome = os.path.expanduser("~/.config")

-        return [os.path.expanduser('~/.hgrc'),
-                os.path.join(confighome, 'hg', 'hgrc')]
+        return [
+            os.path.expanduser("~/.hgrc"),
+            os.path.join(confighome, "hg", "hgrc"),
+        ]

 def termsize(ui):
     try:
         import termios
         TIOCGWINSZ = termios.TIOCGWINSZ  # unavailable on IRIX (issue3449)
     except (AttributeError, ImportError):
         return 80, 24
     for dev in (ui.ferr, ui.fout, ui.fin):
         try:
             try:
                 fd = dev.fileno()
             except AttributeError:
                 continue
             if not os.isatty(fd):
                 continue
-            arri = fcntl.ioctl(fd, TIOCGWINSZ, '\0' * 8)
-            height, width = array.array(r'h', arri)[:2]
+            arri = fcntl.ioctl(fd, TIOCGWINSZ, "\0" * 8)
+            height, width = array.array(r"h", arri)[:2]
             if width > 0 and height > 0:
                 return width, height
         except ValueError:
             pass
         except IOError as e:
             if e[0] == errno.EINVAL:
                 pass
             else:
                 raise
     return 80, 24
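`_rcfiles` collects `<path>/hgrc` plus every `*.rc` file under `<path>/hgrc.d`, silently tolerating a missing directory. A stdlib-only sketch (sorted for determinism; the real code goes through `util.listdir`):

```python
import os
import tempfile

def rcfiles(path):
    rcs = [os.path.join(path, "hgrc")]
    rcdir = os.path.join(path, "hgrc.d")
    try:
        rcs.extend(
            os.path.join(rcdir, f)
            for f in sorted(os.listdir(rcdir))
            if f.endswith(".rc")
        )
    except OSError:  # no hgrc.d directory; keep just <path>/hgrc
        pass
    return rcs
```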
 from __future__ import absolute_import

 import os

-from . import (
-    encoding,
-    pycompat,
-    util,
-    win32,
-)
+from . import encoding, pycompat, util, win32

 try:
     import _winreg as winreg
     winreg.CloseKey
 except ImportError:
     import winreg

 # MS-DOS 'more' is the only pager available by default on Windows.
-fallbackpager = 'more'
+fallbackpager = "more"

 def systemrcpath():
-    '''return default os-specific hgrc search path'''
+    """return default os-specific hgrc search path"""
     rcpath = []
     filename = win32.executablepath()
     # Use mercurial.ini found in directory with hg.exe
-    progrc = os.path.join(os.path.dirname(filename), 'mercurial.ini')
+    progrc = os.path.join(os.path.dirname(filename), "mercurial.ini")
     rcpath.append(progrc)
     # Use hgrc.d found in directory with hg.exe
-    progrcd = os.path.join(os.path.dirname(filename), 'hgrc.d')
+    progrcd = os.path.join(os.path.dirname(filename), "hgrc.d")
     if os.path.isdir(progrcd):
         for f, kind in util.listdir(progrcd):
-            if f.endswith('.rc'):
+            if f.endswith(".rc"):
                 rcpath.append(os.path.join(progrcd, f))
     # else look for a system rcpath in the registry
-    value = util.lookupreg('SOFTWARE\\Mercurial', None,
-                           winreg.HKEY_LOCAL_MACHINE)
+    value = util.lookupreg(
+        "SOFTWARE\\Mercurial", None, winreg.HKEY_LOCAL_MACHINE
+    )
     if not isinstance(value, str) or not value:
         return rcpath
     value = util.localpath(value)
     for p in value.split(pycompat.ospathsep):
-        if p.lower().endswith('mercurial.ini'):
+        if p.lower().endswith("mercurial.ini"):
             rcpath.append(p)
         elif os.path.isdir(p):
             for f, kind in util.listdir(p):
-                if f.endswith('.rc'):
+                if f.endswith(".rc"):
                     rcpath.append(os.path.join(p, f))
     return rcpath

 def userrcpath():
-    '''return os-specific hgrc search path to the user dir'''
-    home = os.path.expanduser('~')
-    path = [os.path.join(home, 'mercurial.ini'),
-            os.path.join(home, '.hgrc')]
-    userprofile = encoding.environ.get('USERPROFILE')
+    """return os-specific hgrc search path to the user dir"""
+    home = os.path.expanduser("~")
+    path = [os.path.join(home, "mercurial.ini"), os.path.join(home, ".hgrc")]
+    userprofile = encoding.environ.get("USERPROFILE")
     if userprofile and userprofile != home:
-        path.append(os.path.join(userprofile, 'mercurial.ini'))
-        path.append(os.path.join(userprofile, '.hgrc'))
+        path.append(os.path.join(userprofile, "mercurial.ini"))
+        path.append(os.path.join(userprofile, ".hgrc"))
     return path

 def termsize(ui):
     return win32.termsize()
 # stack.py - Mercurial functions for stack definition
 #
 # Copyright Matt Mackall <mpm@selenic.com> and other
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.

 from __future__ import absolute_import

-from . import (
-    revsetlang,
-    scmutil,
-)
+from . import revsetlang, scmutil

 def getstack(repo, rev=None):
     """return a sorted smartrev of the stack containing either rev if it is
     not None or the current working directory parent.

     The stack will always contain all drafts changesets which are ancestors to
     the revision and are not merges.
     """
     if rev is None:
-        rev = '.'
+        rev = "."

-    revspec = 'reverse(only(%s) and not public() and not ::merge())'
+    revspec = "reverse(only(%s) and not public() and not ::merge())"
     revset = revsetlang.formatspec(revspec, rev)
     revisions = scmutil.revrange(repo, [revset])
     revisions.sort()
     return revisions
 .hg/ directory.

 We store the data on disk in cbor, for which we use the third party cbor library
 to serialize and deserialize data.
 """

 from __future__ import absolute_import

-from . import (
-    error,
-    util,
-)
-from .utils import (
-    cborutil,
-)
+from . import error, util
+from .utils import cborutil

 class cmdstate(object):
     """a wrapper class to store the state of commands like `rebase`, `graft`,
     `histedit`, `shelve` etc. Extensions can also use this to write state files.

     All the data for the state is stored in the form of key-value pairs in a
     dictionary.

         return self._read()

     def save(self, version, data):
         """write all the state data stored to .hg/<filename> file

         we use third-party library cbor to serialize data to write in the file.
         """
         if not isinstance(version, int):
-            raise error.ProgrammingError("version of state file should be"
-                                         " an integer")
+            raise error.ProgrammingError(
+                "version of state file should be" " an integer"
+            )

-        with self._repo.vfs(self.fname, 'wb', atomictemp=True) as fp:
-            fp.write('%d\n' % version)
+        with self._repo.vfs(self.fname, "wb", atomictemp=True) as fp:
+            fp.write("%d\n" % version)
             for chunk in cborutil.streamencode(data):
                 fp.write(chunk)

     def _read(self):
         """reads the state file and returns a dictionary which contain
         data in the same format as it was before storing"""
-        with self._repo.vfs(self.fname, 'rb') as fp:
+        with self._repo.vfs(self.fname, "rb") as fp:
             try:
                 int(fp.readline())
             except ValueError:
-                raise error.CorruptedState("unknown version of state file"
-                                           " found")
+                raise error.CorruptedState(
+                    "unknown version of state file" " found"
+                )
             return cborutil.decodeall(fp.read())[0]

     def delete(self):
         """drop the state file if exists"""
         util.unlinkpath(self._repo.vfs.join(self.fname), ignoremissing=True)

     def exists(self):
         """check whether the state file exists or not"""
         return self._repo.vfs.exists(self.fname)
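The split literals black leaves in the ProgrammingError and CorruptedState calls above (e.g. `"version of state file should be" " an integer"`) rely on Python's compile-time concatenation of adjacent string literals; black preserves the separate literals rather than joining them into one. A minimal check that the two spellings produce the identical string:

```python
# Adjacent string literals are concatenated at compile time, so the
# form black emits is the same string object as the joined spelling.
split = "version of state file should be" " an integer"
joined = "version of state file should be an integer"
assert split == joined
print(split)
```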
 # txnutil.py - transaction related utilities
 #
 # Copyright FUJIWARA Katsunori <foozy@lares.dti.ne.jp> and others
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.

 from __future__ import absolute_import

 import errno

-from . import (
-    encoding,
-)
+from . import encoding

 def mayhavepending(root):
-    '''return whether 'root' may have pending changes, which are
+    """return whether 'root' may have pending changes, which are
     visible to this process.
-    '''
-    return root == encoding.environ.get('HG_PENDING')
+    """
+    return root == encoding.environ.get("HG_PENDING")

 def trypending(root, vfs, filename, **kwargs):
-    '''Open file to be read according to HG_PENDING environment variable
+    """Open file to be read according to HG_PENDING environment variable

     This opens '.pending' of specified 'filename' only when HG_PENDING
     is equal to 'root'.

     This returns '(fp, is_pending_opened)' tuple.
-    '''
+    """
     if mayhavepending(root):
         try:
-            return (vfs('%s.pending' % filename, **kwargs), True)
+            return (vfs("%s.pending" % filename, **kwargs), True)
         except IOError as inst:
             if inst.errno != errno.ENOENT:
                 raise
     return (vfs(filename, **kwargs), False)
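For reference, the fallback logic in trypending can be sketched standalone. This is only an illustration: it substitutes plain `open` for Mercurial's vfs layer and an explicit `pending_root` argument for the HG_PENDING environment lookup, and the file names are hypothetical.

```python
import errno
import os
import tempfile

def trypending(root, filename, pending_root):
    """Mimic txnutil.trypending: prefer filename + '.pending' when
    pending_root (standing in for HG_PENDING) matches root."""
    if root == pending_root:
        try:
            return (open(filename + ".pending", "rb"), True)
        except IOError as inst:
            # Only a missing .pending file falls through to the
            # committed copy; other I/O errors propagate.
            if inst.errno != errno.ENOENT:
                raise
    return (open(filename, "rb"), False)

with tempfile.TemporaryDirectory() as root:
    path = os.path.join(root, "bookmarks")
    with open(path, "wb") as f:
        f.write(b"committed")
    # No .pending file yet: the committed copy is opened.
    fp, pending = trypending(root, path, root)
    data1 = (fp.read(), pending)
    fp.close()
    with open(path + ".pending", "wb") as f:
        f.write(b"pending")
    # Now the pending variant wins.
    fp, pending = trypending(root, path, root)
    data2 = (fp.read(), pending)
    fp.close()
    print(data1, data2)
```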
+[tool.black]
+line-length = 80
+exclude = 'build/|wheelhouse/|dist/|packages/|\.hg/|\.mypy_cache/|\.venv/|mercurial/thirdparty/|hgext/fsmonitor/pywatchman/|contrib/python-zstandard/'
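The exclude value is a regular expression that black searches against file paths; exactly how black normalizes paths before matching is an implementation detail, so the following is only a sanity check of the pattern itself, with the value copied verbatim from the pyproject.toml above:

```python
import re

# The exclude pattern from the [tool.black] section, copied verbatim.
exclude = re.compile(
    r"build/|wheelhouse/|dist/|packages/|\.hg/|\.mypy_cache/|\.venv/"
    r"|mercurial/thirdparty/|hgext/fsmonitor/pywatchman/"
    r"|contrib/python-zstandard/"
)

# Vendored code stays untouched; first-party modules are formatted.
assert exclude.search("mercurial/thirdparty/attr/validators.py")
assert exclude.search("contrib/python-zstandard/setup.py")
assert not exclude.search("mercurial/stack.py")
print("vendored paths excluded, first-party paths kept")
```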
 CONTRIBUTING
 CONTRIBUTORS
 COPYING
 Makefile
 README.rst
 hg
 hgeditor
 hgweb.cgi
+pyproject.toml
 setup.py

 Prevent adding modules which could be shadowed by ancient .so/.dylib.

   $ testrepohg files \
   > mercurial/base85.py \
   > mercurial/bdiff.py \
   > mercurial/diffhelpers.py \
mercurial/mergeutil.py, line 18: I'm surprised by this result. I'd like to think the reformatting tool would be smarter than this.