Adding upstream version 2.25.15.
Signed-off-by: Daniel Baumann <daniel.baumann@progress-linux.org>
parent 10737b110a
commit b543f2e88d
485 changed files with 191459 additions and 0 deletions
18	.perltidyrc	Normal file
@@ -0,0 +1,18 @@
# -*- conf -*-
#
# Default options for perltidy for proper Perl code reformatting.
#
# This file is based on the one from the rra-c-util package,
# which can be found at <https://www.eyrie.org/~eagle/software/rra-c-util/>.

-bbao    # put line breaks before any operator
-nbbc    # don't force blank lines before comments (bad for else blocks)
-ce      # cuddle braces around else
-l=79    # usually use 78, but don't want 79-long lines reformatted
-pt=2    # don't add extra whitespace around parentheses
-sbt=2   # ...or square brackets
-sfs     # no space before semicolon in for (not that I use this form)
-bar     # opening-brace-always-on-right
-sot     # avoid lines with isolated opening tokens
-sct     # ... same for closing tokens
-fs      # allow "perltidy, please don't touch this" sections
340	COPYING	Normal file
@@ -0,0 +1,340 @@
GNU GENERAL PUBLIC LICENSE
|
||||
Version 2, June 1991
|
||||
|
||||
Copyright (C) 1989, 1991 Free Software Foundation, Inc.
|
||||
51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
|
||||
Everyone is permitted to copy and distribute verbatim copies
|
||||
of this license document, but changing it is not allowed.
|
||||
|
||||
Preamble
|
||||
|
||||
The licenses for most software are designed to take away your
|
||||
freedom to share and change it. By contrast, the GNU General Public
|
||||
License is intended to guarantee your freedom to share and change free
|
||||
software--to make sure the software is free for all its users. This
|
||||
General Public License applies to most of the Free Software
|
||||
Foundation's software and to any other program whose authors commit to
|
||||
using it. (Some other Free Software Foundation software is covered by
|
||||
the GNU Library General Public License instead.) You can apply it to
|
||||
your programs, too.
|
||||
|
||||
When we speak of free software, we are referring to freedom, not
|
||||
price. Our General Public Licenses are designed to make sure that you
|
||||
have the freedom to distribute copies of free software (and charge for
|
||||
this service if you wish), that you receive source code or can get it
|
||||
if you want it, that you can change the software or use pieces of it
|
||||
in new free programs; and that you know you can do these things.
|
||||
|
||||
To protect your rights, we need to make restrictions that forbid
|
||||
anyone to deny you these rights or to ask you to surrender the rights.
|
||||
These restrictions translate to certain responsibilities for you if you
|
||||
distribute copies of the software, or if you modify it.
|
||||
|
||||
For example, if you distribute copies of such a program, whether
|
||||
gratis or for a fee, you must give the recipients all the rights that
|
||||
you have. You must make sure that they, too, receive or can get the
|
||||
source code. And you must show them these terms so they know their
|
||||
rights.
|
||||
|
||||
We protect your rights with two steps: (1) copyright the software, and
|
||||
(2) offer you this license which gives you legal permission to copy,
|
||||
distribute and/or modify the software.
|
||||
|
||||
Also, for each author's protection and ours, we want to make certain
|
||||
that everyone understands that there is no warranty for this free
|
||||
software. If the software is modified by someone else and passed on, we
|
||||
want its recipients to know that what they have is not the original, so
|
||||
that any problems introduced by others will not reflect on the original
|
||||
authors' reputations.
|
||||
|
||||
Finally, any free program is threatened constantly by software
|
||||
patents. We wish to avoid the danger that redistributors of a free
|
||||
program will individually obtain patent licenses, in effect making the
|
||||
program proprietary. To prevent this, we have made it clear that any
|
||||
patent must be licensed for everyone's free use or not licensed at all.
|
||||
|
||||
The precise terms and conditions for copying, distribution and
|
||||
modification follow.
|
||||
|
||||
GNU GENERAL PUBLIC LICENSE
|
||||
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
|
||||
|
||||
0. This License applies to any program or other work which contains
|
||||
a notice placed by the copyright holder saying it may be distributed
|
||||
under the terms of this General Public License. The "Program", below,
|
||||
refers to any such program or work, and a "work based on the Program"
|
||||
means either the Program or any derivative work under copyright law:
|
||||
that is to say, a work containing the Program or a portion of it,
|
||||
either verbatim or with modifications and/or translated into another
|
||||
language. (Hereinafter, translation is included without limitation in
|
||||
the term "modification".) Each licensee is addressed as "you".
|
||||
|
||||
Activities other than copying, distribution and modification are not
|
||||
covered by this License; they are outside its scope. The act of
|
||||
running the Program is not restricted, and the output from the Program
|
||||
is covered only if its contents constitute a work based on the
|
||||
Program (independent of having been made by running the Program).
|
||||
Whether that is true depends on what the Program does.
|
||||
|
||||
1. You may copy and distribute verbatim copies of the Program's
|
||||
source code as you receive it, in any medium, provided that you
|
||||
conspicuously and appropriately publish on each copy an appropriate
|
||||
copyright notice and disclaimer of warranty; keep intact all the
|
||||
notices that refer to this License and to the absence of any warranty;
|
||||
and give any other recipients of the Program a copy of this License
|
||||
along with the Program.
|
||||
|
||||
You may charge a fee for the physical act of transferring a copy, and
|
||||
you may at your option offer warranty protection in exchange for a fee.
|
||||
|
||||
2. You may modify your copy or copies of the Program or any portion
|
||||
of it, thus forming a work based on the Program, and copy and
|
||||
distribute such modifications or work under the terms of Section 1
|
||||
above, provided that you also meet all of these conditions:
|
||||
|
||||
a) You must cause the modified files to carry prominent notices
|
||||
stating that you changed the files and the date of any change.
|
||||
|
||||
b) You must cause any work that you distribute or publish, that in
|
||||
whole or in part contains or is derived from the Program or any
|
||||
part thereof, to be licensed as a whole at no charge to all third
|
||||
parties under the terms of this License.
|
||||
|
||||
c) If the modified program normally reads commands interactively
|
||||
when run, you must cause it, when started running for such
|
||||
interactive use in the most ordinary way, to print or display an
|
||||
announcement including an appropriate copyright notice and a
|
||||
notice that there is no warranty (or else, saying that you provide
|
||||
a warranty) and that users may redistribute the program under
|
||||
these conditions, and telling the user how to view a copy of this
|
||||
License. (Exception: if the Program itself is interactive but
|
||||
does not normally print such an announcement, your work based on
|
||||
the Program is not required to print an announcement.)
|
||||
|
||||
These requirements apply to the modified work as a whole. If
|
||||
identifiable sections of that work are not derived from the Program,
|
||||
and can be reasonably considered independent and separate works in
|
||||
themselves, then this License, and its terms, do not apply to those
|
||||
sections when you distribute them as separate works. But when you
|
||||
distribute the same sections as part of a whole which is a work based
|
||||
on the Program, the distribution of the whole must be on the terms of
|
||||
this License, whose permissions for other licensees extend to the
|
||||
entire whole, and thus to each and every part regardless of who wrote it.
|
||||
|
||||
Thus, it is not the intent of this section to claim rights or contest
|
||||
your rights to work written entirely by you; rather, the intent is to
|
||||
exercise the right to control the distribution of derivative or
|
||||
collective works based on the Program.
|
||||
|
||||
In addition, mere aggregation of another work not based on the Program
|
||||
with the Program (or with a work based on the Program) on a volume of
|
||||
a storage or distribution medium does not bring the other work under
|
||||
the scope of this License.
|
||||
|
||||
3. You may copy and distribute the Program (or a work based on it,
|
||||
under Section 2) in object code or executable form under the terms of
|
||||
Sections 1 and 2 above provided that you also do one of the following:
|
||||
|
||||
a) Accompany it with the complete corresponding machine-readable
|
||||
source code, which must be distributed under the terms of Sections
|
||||
1 and 2 above on a medium customarily used for software interchange; or,
|
||||
|
||||
b) Accompany it with a written offer, valid for at least three
|
||||
years, to give any third party, for a charge no more than your
|
||||
cost of physically performing source distribution, a complete
|
||||
machine-readable copy of the corresponding source code, to be
|
||||
distributed under the terms of Sections 1 and 2 above on a medium
|
||||
customarily used for software interchange; or,
|
||||
|
||||
c) Accompany it with the information you received as to the offer
|
||||
to distribute corresponding source code. (This alternative is
|
||||
allowed only for noncommercial distribution and only if you
|
||||
received the program in object code or executable form with such
|
||||
an offer, in accord with Subsection b above.)
|
||||
|
||||
The source code for a work means the preferred form of the work for
|
||||
making modifications to it. For an executable work, complete source
|
||||
code means all the source code for all modules it contains, plus any
|
||||
associated interface definition files, plus the scripts used to
|
||||
control compilation and installation of the executable. However, as a
|
||||
special exception, the source code distributed need not include
|
||||
anything that is normally distributed (in either source or binary
|
||||
form) with the major components (compiler, kernel, and so on) of the
|
||||
operating system on which the executable runs, unless that component
|
||||
itself accompanies the executable.
|
||||
|
||||
If distribution of executable or object code is made by offering
|
||||
access to copy from a designated place, then offering equivalent
|
||||
access to copy the source code from the same place counts as
|
||||
distribution of the source code, even though third parties are not
|
||||
compelled to copy the source along with the object code.
|
||||
|
||||
4. You may not copy, modify, sublicense, or distribute the Program
|
||||
except as expressly provided under this License. Any attempt
|
||||
otherwise to copy, modify, sublicense or distribute the Program is
|
||||
void, and will automatically terminate your rights under this License.
|
||||
However, parties who have received copies, or rights, from you under
|
||||
this License will not have their licenses terminated so long as such
|
||||
parties remain in full compliance.
|
||||
|
||||
5. You are not required to accept this License, since you have not
|
||||
signed it. However, nothing else grants you permission to modify or
|
||||
distribute the Program or its derivative works. These actions are
|
||||
prohibited by law if you do not accept this License. Therefore, by
|
||||
modifying or distributing the Program (or any work based on the
|
||||
Program), you indicate your acceptance of this License to do so, and
|
||||
all its terms and conditions for copying, distributing or modifying
|
||||
the Program or works based on it.
|
||||
|
||||
6. Each time you redistribute the Program (or any work based on the
|
||||
Program), the recipient automatically receives a license from the
|
||||
original licensor to copy, distribute or modify the Program subject to
|
||||
these terms and conditions. You may not impose any further
|
||||
restrictions on the recipients' exercise of the rights granted herein.
|
||||
You are not responsible for enforcing compliance by third parties to
|
||||
this License.
|
||||
|
||||
7. If, as a consequence of a court judgment or allegation of patent
|
||||
infringement or for any other reason (not limited to patent issues),
|
||||
conditions are imposed on you (whether by court order, agreement or
|
||||
otherwise) that contradict the conditions of this License, they do not
|
||||
excuse you from the conditions of this License. If you cannot
|
||||
distribute so as to satisfy simultaneously your obligations under this
|
||||
License and any other pertinent obligations, then as a consequence you
|
||||
may not distribute the Program at all. For example, if a patent
|
||||
license would not permit royalty-free redistribution of the Program by
|
||||
all those who receive copies directly or indirectly through you, then
|
||||
the only way you could satisfy both it and this License would be to
|
||||
refrain entirely from distribution of the Program.
|
||||
|
||||
If any portion of this section is held invalid or unenforceable under
|
||||
any particular circumstance, the balance of the section is intended to
|
||||
apply and the section as a whole is intended to apply in other
|
||||
circumstances.
|
||||
|
||||
It is not the purpose of this section to induce you to infringe any
|
||||
patents or other property right claims or to contest validity of any
|
||||
such claims; this section has the sole purpose of protecting the
|
||||
integrity of the free software distribution system, which is
|
||||
implemented by public license practices. Many people have made
|
||||
generous contributions to the wide range of software distributed
|
||||
through that system in reliance on consistent application of that
|
||||
system; it is up to the author/donor to decide if he or she is willing
|
||||
to distribute software through any other system and a licensee cannot
|
||||
impose that choice.
|
||||
|
||||
This section is intended to make thoroughly clear what is believed to
|
||||
be a consequence of the rest of this License.
|
||||
|
||||
8. If the distribution and/or use of the Program is restricted in
|
||||
certain countries either by patents or by copyrighted interfaces, the
|
||||
original copyright holder who places the Program under this License
|
||||
may add an explicit geographical distribution limitation excluding
|
||||
those countries, so that distribution is permitted only in or among
|
||||
countries not thus excluded. In such case, this License incorporates
|
||||
the limitation as if written in the body of this License.
|
||||
|
||||
9. The Free Software Foundation may publish revised and/or new versions
|
||||
of the General Public License from time to time. Such new versions will
|
||||
be similar in spirit to the present version, but may differ in detail to
|
||||
address new problems or concerns.
|
||||
|
||||
Each version is given a distinguishing version number. If the Program
|
||||
specifies a version number of this License which applies to it and "any
|
||||
later version", you have the option of following the terms and conditions
|
||||
either of that version or of any later version published by the Free
|
||||
Software Foundation. If the Program does not specify a version number of
|
||||
this License, you may choose any version ever published by the Free Software
|
||||
Foundation.
|
||||
|
||||
10. If you wish to incorporate parts of the Program into other free
|
||||
programs whose distribution conditions are different, write to the author
|
||||
to ask for permission. For software which is copyrighted by the Free
|
||||
Software Foundation, write to the Free Software Foundation; we sometimes
|
||||
make exceptions for this. Our decision will be guided by the two goals
|
||||
of preserving the free status of all derivatives of our free software and
|
||||
of promoting the sharing and reuse of software generally.
|
||||
|
||||
NO WARRANTY
|
||||
|
||||
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
|
||||
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
|
||||
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
|
||||
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
|
||||
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
|
||||
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
|
||||
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
|
||||
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
|
||||
REPAIR OR CORRECTION.
|
||||
|
||||
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
|
||||
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
|
||||
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
|
||||
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
|
||||
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
|
||||
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
|
||||
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
|
||||
POSSIBILITY OF SUCH DAMAGES.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
convey the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) 19yy <name of author>
|
||||
|
||||
This program is free software; you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation; either version 2 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program; if not, write to the Free Software
|
||||
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
|
||||
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program is interactive, make it output a short notice like this
|
||||
when it starts in an interactive mode:
|
||||
|
||||
Gnomovision version 69, Copyright (C) 19yy name of author
|
||||
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, the commands you use may
|
||||
be called something other than `show w' and `show c'; they could even be
|
||||
mouse-clicks or menu items--whatever suits your program.
|
||||
|
||||
You should also get your employer (if you work as a programmer) or your
|
||||
school, if any, to sign a "copyright disclaimer" for the program, if
|
||||
necessary. Here is a sample; alter the names:
|
||||
|
||||
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
|
||||
`Gnomovision' (which makes passes at compilers) written by James Hacker.
|
||||
|
||||
<signature of Ty Coon>, 1 April 1989
|
||||
Ty Coon, President of Vice
|
||||
|
||||
This General Public License does not permit incorporating your program into
|
||||
proprietary programs. If your program is a subroutine library, you may
|
||||
consider it more useful to permit linking proprietary applications with the
|
||||
library. If this is what you want to do, use the GNU Library General
|
||||
Public License instead of this License.
|
77	Makefile	Normal file
@@ -0,0 +1,77 @@
# Simplified Makefile for devscripts

include Makefile.common

DESTDIR =

all: version doc make_scripts conf.default translated_manpages

version:
	rm -f version
	dpkg-parsechangelog -SVersion > version

conf.default: conf.default.in version
	rm -f $@ $@.tmp
	VERSION=`cat version` && sed -e "s/###VERSION###/$$VERSION/" $< \
		> $@.tmp && mv $@.tmp $@

translated_manpages:
	$(MAKE) -C po4a/
	touch translated_manpages

clean_translated_manpages:
	# Update the POT/POs and remove the translated man pages
	$(MAKE) -C po4a/ clean
	rm -f translated_manpages

clean: clean_scripts clean_doc clean_translated_manpages
	rm -f version conf.default make_scripts

doc:
	$(MAKE) -C doc

online-test:
	$(MAKE) -C test/ online-test

destructive-test:
	$(MAKE) -C test/ $@

destructive-test-installed:
	$(MAKE) -C test/ $@

test:
	$(MAKE) test_scripts
	$(MAKE) test_test

test-installed:
	$(MAKE) -C test/ $@

install: all install_scripts install_doc
	install -d "$(DESTDIR)$(PERLMOD_DIR)" \
		"$(DESTDIR)$(DATA_DIR)" "$(DESTDIR)$(TEMPLATES_DIR)" \
		"$(DESTDIR)$(DOCDIR)"
	for f in lib/*; do cp -a "$$f" "$(DESTDIR)$(PERLMOD_DIR)"; done
	install -m0644 conf.default "$(DESTDIR)$(DATA_DIR)"
	install -m0644 templates/README.mk-build-deps "$(DESTDIR)$(TEMPLATES_DIR)"
	install -m0644 README.md "$(DESTDIR)$(DOCDIR)"
	ln -sf edit-patch.1 "$(DESTDIR)$(MAN1DIR)/add-patch.1"

test_test:
	$(MAKE) -C test/ test

make_scripts: version
	$(MAKE) -C scripts/
	touch $@
clean_scripts: clean_translated_manpages
	$(MAKE) -C scripts/ clean
test_scripts:
	$(MAKE) -C scripts/ test
install_scripts:
	$(MAKE) -C scripts/ install DESTDIR=$(DESTDIR)

clean_doc: clean_translated_manpages
	$(MAKE) -C doc clean
install_doc:
	$(MAKE) -C doc install DESTDIR=$(DESTDIR)

.PHONY: online-test test test-installed
17	Makefile.common	Normal file
@@ -0,0 +1,17 @@
GEN_MAN1S := bts.1 build-rdeps.1 chdist.1 debcheckout.1 debcommit.1 \
	deb-reversion.1 dget.1 mass-bug.1 \
	mk-build-deps.1 mk-origtargz.1 namecheck.1 rmadison.1 sadt.1 svnpath.1 \
	uscan.1 salsa.1 \
	tagpending.1 origtargz.1 transition-check.1 who-permits-upload.1 \
	git-deborig.1 hardening-check.1

PREFIX = /usr
BINDIR = $(PREFIX)/bin
DOCDIR = $(PREFIX)/share/doc/devscripts
MAN1DIR = $(PREFIX)/share/man/man1
MAN5DIR = $(PREFIX)/share/man/man5
MAN7DIR = $(PREFIX)/share/man/man7
PERLMOD_DIR = $(shell perl -MConfig -e 'print $$Config{vendorlib}')
DATA_DIR = $(PREFIX)/share/devscripts
TEMPLATES_DIR = $(DATA_DIR)/templates
SYSCONFDIR = /etc
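
The PERLMOD_DIR line above asks Perl itself where vendor modules belong. A quick way to see the value it will expand to on a given system (on Debian this is normally /usr/share/perl5, though that is the platform's setting rather than anything this Makefile fixes):

    # print the vendorlib directory that PERLMOD_DIR will expand to
    perl -MConfig -e 'print $Config{vendorlib}'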
333	README.md	Normal file
@@ -0,0 +1,333 @@
README for Debian devscripts package
====================================

Devscripts provides several scripts which may be of use to Debian
developers.  The following gives a summary of the available scripts --
please read the manpages for full details about the use of these
scripts.  They are contributed by multiple developers; for details of
the authors, please see the code or manpages.

Many of these scripts also depend on other packages.  Rather than
burden the package with a large number of dependencies, most of which
will not be needed by most people, the individual dependencies are
listed as "Recommends" in the control file; this ensures that they are
installed by default but allows users to remove them if desired.
Dependencies of scripts that are unlikely to be used by many people are
only listed as "Suggests".  The dependencies and recommendations are
listed in square brackets in the descriptions below, as well as in the
Description field in the control file.
The scripts marked with an asterisk ('*') are considered "core", and as
such have their dependencies all listed as hard "Depends".

And now, in alphabetical order, the scripts:

- annotate-output: run a command and prepend time and stream (O for stdout,
  E for stderr) for every line of output (see the example after this list).

- archpath: Prints arch (tla/Bazaar 1.x) package names.  Also supports
  calculating the package names for other branches. [tla]

- bts: A command-line tool for accessing the Debian Bug Tracking System, both
  to send mails to control@bts.debian.org and to access the web pages and
  SOAP interface of the BTS. [www-browser, libauthen-sasl-perl,
  libsoap-lite-perl, liburi-perl, libwww-perl, bsd-mailx | mailx]

- build-rdeps: Searches for all packages that build-depend on a given
  package. [dctrl-tools, dose-extra, libdpkg-perl]

- chdist: tool to easily play with several distributions. [dctrl-tools]

- checkbashisms: check whether a /bin/sh script contains any common
  bash-specific constructs.

- cowpoke: upload a Debian source package to a cowbuilder host and build it,
  optionally also signing and uploading the result to an incoming queue.
  [ssh-client]

- dcmd: run a given command replacing the name of a .changes or .dsc file
  with each of the files referenced therein. *

- dd-list: given a list of packages, pretty-print it ordered by maintainer. *

- debbisect: bisect snapshot.debian.org to find which change in the archive
  introduced a certain problem. [debvm, mmdebstrap, python3-debian]

- debc: List contents of current package.  Do this after a successful
  "debuild" to see if the package looks all right.

- debchange (abbreviation dch): Modifies debian/changelog and manages version
  numbers for you.  It will either increment the version number or add an
  entry for the current version, depending upon the options given to it.
  [libdistro-info-perl, libsoap-lite-perl]*

- debcheckout: checkout the development repository of a Debian package. *

- debclean: Clean a Debian source tree.  Debclean will clean all Debian
  source trees below the current directory, and if requested, also remove
  all files that were generated from these source trees (that is .deb, .dsc
  and .changes files).  It will keep the .diffs and original files, though,
  so that the binaries and other files can be rebuilt if necessary. *

- debcommit: Commits changes to cvs, darcs, svn, svk, tla, bzr, git, or hg,
  using new entries in debian/changelog as the commit message.  Also supports
  tagging Debian package releases. [cvs | darcs | subversion | svk | tla |
  bzr | git-core | mercurial, libtimedate-perl]

- debdiff: A program which examines two .deb files or two .changes files and
  reports on any difference found in their file lists.  Useful for ensuring
  that no files were inadvertently lost between versions.  Can also examine
  two .dsc files and report on the changes between source versions.
  For a deeper comparison one can use the diffoscope package.
  [wdiff, patchutils]*

- debdiff-apply: Apply unified diffs of two Debian source packages, such as
  those generated by debdiff, to a target Debian source package.  Any changes
  to debian/changelog are dealt with specially, to avoid the conflicts that
  changelog diffs typically produce when applied naively.  May be used to
  check that old patches still apply to newer versions of those packages.
  [python3-debian, python3-unidiff, quilt]

- debftbfs: list source packages which have FTBFS bugs filed against them
  and print them with the bug number and title.
  [postgresql-client, python3-debian, python3-debianbts]

- debi: Installs the current package by using dpkg.  It assumes that the
  current package has just been built (for example by debuild), and the .deb
  lives in the parent directory, and will effectively run dpkg -i on the .deb.
  The ability to install the package with a very short command is very
  useful when troubleshooting packages.

- debootsnap: Combines debootstrap and snapshot.debian.org to create a chroot
  containing exactly the requested selection of packages.  This can be used
  to re-create a chroot from the past, for example to reproduce a bug.  The
  tool is also used by debrebuild to build a package in a chroot with build
  dependencies in the same version as recorded in the buildinfo file.
  [apt-utils, equivs, mmdebstrap, python3-debian, python3-pycurl,
  python3-requests]

- debrelease: A wrapper around dupload or dput which figures out which
  version to upload, and then calls dupload or dput to actually perform
  the upload. [dupload | dput, ssh-client]

- debrebuild: Given a buildinfo file, builds the referenced source package
  in an environment documented in the provided buildinfo file.  The build
  can be performed by sbuild or other builders in a chroot environment created
  by debootsnap.  The generated artifacts will be verified against the
  hashes from the buildinfo file.
  [sbuild | mmdebstrap, python3-pycurl, libdpkg-perl]

- debrepro: A script that tests reproducibility of Debian packages.  It will
  build a given source directory twice, with a set of variations between the
  first and second build, and compare the binary packages produced.  If
  diffoscope is installed, it is used to compare non-matching binaries.  If
  disorderfs is installed, it is used during the build to inject
  non-determinism in filesystem listing operations.
  [faketime, diffoscope, disorderfs]

- debrsign: This transfers a .changes/.dsc pair to a remote machine for
  signing, and runs debsign on the remote machine over an SSH connection.
  [gnupg, debian-keyring, ssh-client]

- debsign: Use GNU Privacy Guard to sign the changes (and possibly dsc)
  files created by running dpkg-buildpackage with no-sign options.  Useful
  if you are building a package on a remote machine and wish to sign it on
  a local one.  This script is capable of automatically downloading the
  .changes and .dsc files from a remote machine. [gnupg, debian-keyring,
  ssh-client]*

- debsnap: grab packages from https://snapshot.debian.org [libwww-perl,
  libjson-perl]

- debuild: A wrapper for building a package (i.e., dpkg-buildpackage) to
  avoid problems with insufficient permissions and wrong paths etc.
  Debuild will set up the proper environment for building a package.
  Debuild will also run lintian to check that the package does not
  have any major policy violations. [lintian, gnupg]*

- deb-check-file-conflicts: Check (using apt-file) if a
  Debian package installs files in the exact same path as any other package, and
  if there are Breaks/Replaces (or Conflicts) defined in debian/control to avoid
  the package installation failing on file conflicts.

- deb-janitor: command-line client for interacting with the Debian Janitor.

- deb-reversion: increases a binary package version number and repacks the
  package, useful for porters and the like.

- deb-why-removed: shows the reason a package was removed from the archive.
  [libdpkg-perl]

- deb2apptainer: build a Singularity/Apptainer image with given Debian
  packages.

- deb2docker: build a docker image with given Debian packages. [docker.io]

- dep3changelog: generate a changelog entry from a DEP3-style patch header.

- dep-14-convert-git-branch-name: Convert git branches to follow DEP-14.

- dget: Downloads Debian source and binary packages.  Point at a .changes or
  .dsc to download all referenced files.  Specify a package name to download
  it from the configured apt repository. [wget | curl]

- diff2patches: extracts patches from a .diff.gz file placing them under
  debian/ or, if present, debian/patches. [patchutils]

- dpkg-depcheck, dpkg-genbuilddeps: Runs a specified command (such as
  debian/rules build) or dpkg-buildpackage, respectively, to determine the
  packages used during the build process.  This information can be helpful
  when trying to determine the packages needed in the Build-Depends etc.
  lines in the debian/control file. [build-essential, strace]

- dscextract: extract a single file from a Debian source package.
  [patchutils]

- dscverify: check the signature and MD5 sums of a dsc file against the most
  current Debian keyring on your system. [gnupg, debian-keyring, debian-tag2upload-keyring]

- edit-patch: add/edit a patch for a source package and commit the changes.
  [quilt]

- getbuildlog: download package build logs from Debian auto-builders. [wget]

- git-deborig: try to produce Debian orig.tar using git-archive(1).
  [libdpkg-perl, libgit-wrapper-perl, liblist-compare-perl,
  libstring-shellquote-perl, libtry-tiny-perl]

- grep-excuses: grep britney's excuses to find out what is happening to your
  packages. [libdbd-pg-perl, libterm-size-perl, libyaml-libyaml-perl, wget, w3m]

- hardening-check: report the hardening characteristics of a set of binaries.

- list-unreleased: searches for packages marked UNRELEASED in their
  changelog.

- ltnu (Long Time No Upload): List all uploads of packages by the
  given uploader or maintainer and display them ordered by the last
  upload of that package, oldest uploads first.

- manpage-alert: locate binaries without corresponding manpages. [man-db]

- mass-bug: mass-file bug reports. [bsd-mailx | mailx]

- mergechanges: merge .changes files from the same release but built
  on different architectures.

- mk-build-deps: Given a package name and/or control file, generate a binary
  package which may be installed to satisfy the build-dependencies of the
  given package. [equivs]

- mk-origtargz: Rename upstream tarball, optionally changing the compression
  and removing unwanted files.
  [libfile-which-perl, unzip, xz-utils, file]

- namecheck: Check project names are not already taken.

- nmudiff: prepare a diff of this version (presumably an NMU) against the
  previously released version (as per the changelog) and submit the diff
  to the BTS. [patchutils, mutt]

- origtargz: fetch the orig tarball of a Debian package from various sources,
  and unpack it. [pristine-tar, pristine-lfs]

- plotchangelog: display information from a changelog graphically using
  gnuplot. [libtimedate-perl, gnuplot]

- pts-subscribe: subscribe to the PTS (Package Tracking System) for a
  limited period of time. [bsd-mailx | mailx, at]

- rc-alert: list installed packages which have release-critical bugs.
  [wget | curl]

- reproducible-check: reports on the reproducible status of installed
  packages.  For more details please see <https://reproducible-builds.org>.

- rmadison: remotely query the Debian archive database about packages.
  [liburi-perl, wget | curl]

- sadt: run DEP-8 tests. [python3-debian, autodep8]

- salsa: manipulates salsa.debian.org repositories and users.
  [libgitlab-api-v4-perl]

- suspicious-source: output a list of files which are not common source
  files. [python3-magic]

- svnpath: Prints the path to the Subversion repository of a Subversion
  checkout.  Also supports calculating the paths for branches and
  tags in a repository independent fashion.  Used by debcommit to generate
  svn tags. [subversion]

- tagpending: runs from a Debian source tree and tags bugs that are to be
  closed in the latest changelog as pending. [libsoap-lite-perl]

- transition-check: Check a list of source packages for involvement in
  transitions for which uploads to unstable are currently blocked.
  [libwww-perl, libyaml-libyaml-perl]

- uscan: Automatically scan for and download upstream updates.  Uscan can
  also call a program such as uupdate to attempt to update the Debianised
  version based on the new update.  Whilst uscan could be used to release
  the updated version automatically, it is probably better not to without
  testing it first.  Uscan can also verify detached OpenPGP signatures if
  upstream's signing key is known. [file, sopv | gpgv,
  libfile-dirlist-perl, libfile-touch-perl, libfile-which-perl,
  liblwp-protocol-https-perl, libmoo-perl, libwww-perl, unzip, xz-utils]*

- uupdate: Update the package with an archive or patches from
  an upstream author.  This will be of help if you have to update your
  package.  It will try to apply the latest diffs to your package and
  tell you how successful it was. [patch]

- what-patch: determine what patch system, if any, a source package is using.
  [patchutils]

- whodepends: check which maintainers' packages depend on a package.

- who-permits-upload: Retrieve information about Debian Maintainer access
  control lists. [gnupg, libencode-locale-perl, libwww-perl, debian-keyring]

- who-uploads: determine the most recent uploaders of a package to the Debian
  archive. [gnupg, debian-keyring, debian-maintainers, wget]

- wnpp-alert: list installed packages which are orphaned or up for adoption.
  [wget | curl]

- wnpp-check: check whether there is an open request for packaging or
  intention to package bug for a package. [wget | curl]

- wrap-and-sort: wrap long lines and sort items in packaging files.
  [python3-debian]

- /usr/share/doc/devscripts/examples: This directory contains an example
  exim script for sorting mail arriving to Debian mailing lists.
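
As a quick illustration of two of the helpers above, the following
hypothetical session shows dcmd expanding a .changes file into the files it
references, and annotate-output labelling the output streams of a build
command (the package and file names are made up for the example):

    # copy a source upload and every file it references to another directory
    dcmd cp hello_1.0-1_source.changes /tmp/upload/

    # run a command, prefixing every output line with a timestamp and O/E marker
    annotate-output ./debian/rules build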

Typical Maintenance cycle with devscripts
-----------------------------------------

1. cd <source directory of package>

2. Editing of files

3. Log the changes with: dch -i "I changed this"
   If desired, use debcommit to commit changes to cvs, svn, arch or git.

4. Run debuild to compile it.  If it fails, return to 2.  (You could
   also just test the compilation by running the appropriate part of
   debian/rules.)

5. Check if package contents appear to be ok with "debc"

6. Install the package with "debi" and test the functionality it
   should provide.  (Note that this step requires debpkg to be setuid
   root, or you to be logged in as root or similar.)

7. If all is ok release it by running debrelease.

8. Optionally, use debcommit --release to commit and tag the release
   in revision control.
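
Written out as shell commands, the cycle above might look like the following
sketch; the package directory and changelog message are placeholders, and the
exact options depend on your setup:

    cd hello-1.0                        # 1. enter the source directory
    editor debian/rules                 # 2. edit files
    dch -i "Fix build on arm64"         # 3. record the change in debian/changelog
    debuild                             # 4. build the package (runs lintian at the end)
    debc                                # 5. inspect the contents of the generated .deb
    debi                                # 6. install it for testing (needs root via debpkg)
    debrelease                          # 7. upload with dupload/dput
    debcommit --release                 # 8. commit and tag the release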

Originally by Christoph Lameter <clameter@waterf.org>
Modified extensively by Julian Gilbey <jdg@debian.org>
19	README.newscripts	Normal file
@@ -0,0 +1,19 @@
List of things to do when adding a new script to the devscripts package:

1. Add the script under scripts/
   - the script should have .sh or .pl extension, otherwise have a look at
     scripts/Makefile and patch it
   - if the script is perl and uses embedded POD for documentation, add an
     entry to the GEN_MAN1S variable in Makefile
2. Add an entry in README.md
3. Add an entry in debian/control
4. Add an entry in po4a/devscripts-po4a.conf
5. Add any necessary entries to the Suggests: and Recommends: lines in
   debian/control
6. Modify conf.default.in and debian/postinst if necessary
7. Modify debian/copyright if necessary
8. Add entries in .gitignore
9. Add a changelog entry
10. Add an entry in https://wiki.debian.org/Devscripts/bugs (and send to the BTS)
11. For Python scripts, add the name to the SCRIPTS array in
    scripts/devscripts/test/__init__.py
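
As a minimal sketch of steps 1 and 9 for a hypothetical new shell script
called my-new-tool (the name is made up for illustration), the script simply
lands in scripts/ with a .sh extension so the existing scripts/Makefile rules
pick it up, and the addition is recorded in the changelog:

    # step 1: add the script with a .sh extension so scripts/Makefile handles it
    cp ~/work/my-new-tool.sh scripts/my-new-tool.sh
    chmod 755 scripts/my-new-tool.sh

    # step 9: record the addition in debian/changelog
    dch -i "Add my-new-tool: short description of what it does"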
623	conf.default.in	Normal file
@@ -0,0 +1,623 @@
# This configuration file gives defaults for the scripts in
# the devscripts package, as documented in the individual manpages.
# Variables defined here may be overridden by a per-user ~/.devscripts
# configuration file, which has exactly the same syntax as this file.
#
# This file is sourced by /bin/bash, and should only contain
# comment lines (beginning with a '#'), and lines of the form
# VARIABLE=value
# The value must be quoted if there are spaces in it.
# Variables corresponding to switches (on/off; yes/no) must take
# one of the values 'yes' or 'no'.
# The variable names are all of the form PROGNAME_VARNAME,
# or DEVSCRIPTS_VARNAME if they are more generally applicable.
#
# As new variables are introduced into the devscripts program, their
# descriptions and default values will be appended as comments
# to this file.
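#
# To make the syntax above concrete, a per-user ~/.devscripts override
# might contain lines like the following (illustrative values only, not
# defaults shipped by devscripts; the key ID is a placeholder):
#
#   DEBUILD_LINTIAN=yes
#   DEBCHANGE_RELEASE_HEURISTIC=changelog
#   DEBUILD_DPKG_BUILDPACKAGE_OPTS="-us -uc -sa"
#   DEBSIGN_KEYID=0x0123456789ABCDEF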

# Variables recognised as of devscripts version ###VERSION###:

##### Package-wide variables
#
# Lists of which scripts are affected by these package-wide variables
# can be found in the devscripts.conf(5) manpage.
#
#
# Directory Name Checking
#
# Several programs check the directory name and refuse to function if
# it does not match the name of the package being worked on.  (The
# details are described in the individual manpages.)
# These two variables control this behaviour, corresponding to the
# --check-dirname-level and --check-dirname-regex command line options.
# The possible values of DEVSCRIPTS_CHECK_DIRNAME_LEVEL are:
#   0    never check the directory name
#   1    check the directory name only if the program has changed directory
#   2    always check the directory name
# The variable DEVSCRIPTS_CHECK_DIRNAME_REGEX is a Perl regex which
# defines what is considered a valid directory name for the source
# package PACKAGE; if it includes a '/', then it must match the full
# directory path, otherwise it must match the full directory name.
#
# The default settings are:
# DEVSCRIPTS_CHECK_DIRNAME_LEVEL=1
# DEVSCRIPTS_CHECK_DIRNAME_REGEX='PACKAGE(-.+)?'
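#
# For example, a stricter setup (a sketch, not a shipped default) that
# always checks the directory name and only accepts the bare package name
# or the package name followed by a version number:
#
#   DEVSCRIPTS_CHECK_DIRNAME_LEVEL=2
#   DEVSCRIPTS_CHECK_DIRNAME_REGEX='PACKAGE(-[0-9.]+)?'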
|
||||
|
||||
##### annotate-output
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### archpath
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### bts
|
||||
#
|
||||
# Default bts show/bugs to run in offline mode?
|
||||
# BTS_OFFLINE=no
|
||||
#
|
||||
# Cache all visited bug reports once a cache has been established
|
||||
# for the first time?
|
||||
# BTS_CACHE=yes
|
||||
#
|
||||
# How much to mirror when caching? The minimal amount (min), the mbox
|
||||
# version as well (mbox) or the whole works (full)?
|
||||
# BTS_CACHE_MODE=min
|
||||
#
|
||||
# Always refresh the cache, even if nothing's changed?
|
||||
# BTS_FORCE_REFRESH=no
|
||||
#
|
||||
# How do we read an mbox? This will be split on whitespace, then
|
||||
# %s is replaced by the mbox name and %% by a single %.
|
||||
# BTS_MAIL_READER='mutt -f %s'
|
||||
#
|
||||
# What sendmail command do we use? This will be split on whitespace.
|
||||
# BTS_SENDMAIL_COMMAND='/usr/sbin/sendmail'
|
||||
#
|
||||
# Download only new bugs when caching? If set to yes, don't check for
|
||||
# updates in bugs we already have.
|
||||
# BTS_ONLY_NEW=no
|
||||
#
|
||||
# Which SMTP host should be used? Note that if both an SMTP host and
|
||||
# sendmail command are specified in the configuration file(s), the SMTP
|
||||
# host will be used unless overridden by --sendmail on the command line
|
||||
# BTS_SMTP_HOST=reportbug.debian.org
|
||||
#
|
||||
# If the SMTP host specified above requires authentication, the following
|
||||
# options may be used to specify the username and password to use.
|
||||
# If only a username is provided then the password will be prompted for
|
||||
# before sending the e-mail
|
||||
# BTS_SMTP_AUTH_USERNAME=user
|
||||
# BTS_SMTP_AUTH_PASSWORD=pass
|
||||
#
|
||||
# Specify a HELO to use when connecting to the SMTP host. If not supplied
|
||||
# and the file /etc/mailname exists, its contents will be used as the HELO
|
||||
# BTS_SMTP_HELO=foo.example.com
|
||||
#
|
||||
# Include resolved bugs when caching?
|
||||
# BTS_INCLUDE_RESOLVED=yes
|
||||
#
|
||||
# Suppress BTS acknowledgment e-mails (ignored by the control bot)
|
||||
# BTS_SUPPRESS_ACKS=no
|
||||
#
|
||||
# Allow the generated message to be edited and, if necessary, abandoned
|
||||
# before sending it to the control bot?
|
||||
#
|
||||
# If set to yes, prompt for confirmation / edit / abandonment.
|
||||
# If set to force, spawn an editor and then proceed as if set to yes
|
||||
# BTS_INTERACTIVE=no
|
||||
#
|
||||
# Specify a list of e-mail addresses to which a carbon copy of the
|
||||
# generated e-mail to the control bot should automatically be sent.
|
||||
# BTS_DEFAULT_CC=example@example.com
|
||||
#
|
||||
# Which debbugs server should be used?
|
||||
# BTS_SERVER=https://bugs.debian.org
|
||||
|
||||
##### chdist
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### checkbashisms
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### cowpoke
|
||||
#
|
||||
# No variables currently; see cowpoke.conf and cowpoke(1)
|
||||
|
||||
##### dd-list
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### dcmd
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### deb2apptainer
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### deb2docker
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### debc
|
||||
#
|
||||
# debc recognises the DEBRELEASE_DEBS_DIR variable; see debrelease
|
||||
# below for more information.
|
||||
|
||||
##### deb-reversion
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### debchange/dch
|
||||
#
|
||||
# Preserve the source tree dirname if the upstream version changes?
|
||||
# DEBCHANGE_PRESERVE=no
|
||||
#
|
||||
# Query the BTS when --closes is being used?
|
||||
# DEBCHANGE_QUERY_BTS=yes
|
||||
#
|
||||
# Select a heuristic to use to determine whether the package has released.
|
||||
# See the debchange man page for details.
|
||||
# DEBCHANGE_RELEASE_HEURISTIC=log
|
||||
# DEBCHANGE_RELEASE_HEURISTIC=changelog
|
||||
#
|
||||
# Introduce multiple-maintainer markers in changelog sections?
|
||||
# DEBCHANGE_MULTIMAINT=yes
|
||||
#
|
||||
# When appending to a multiple-maintainer changelog, if there are
|
||||
# existing changes made by the current maintainer, should new
|
||||
# changelog entries be appended to the existing entries?
|
||||
# DEBCHANGE_MULTIMAINT_MERGE=yes
|
||||
#
|
||||
# When appending entries to the changelog, should the trailer line
|
||||
# be maintained as-is?
|
||||
# DEBCHANGE_MAINTTRAILER=yes
|
||||
#
|
||||
# Use a fixed timezone in changelog entries?
|
||||
# DEBCHANGE_TZ=UTC
|
||||
#
|
||||
# Allow a new version to be lower than the current package version
|
||||
# if the new version matches the specified regular expression
|
||||
# DEBCHANGE_LOWER_VERSION_PATTERN=bpo
|
||||
#
|
||||
# Attempt to automatically determine whether the current changelog
|
||||
# stanza represents an NMU?
|
||||
# DEBCHANGE_AUTO_NMU=yes
|
||||
#
|
||||
# When --release was used and an editor presented, force the changelog
|
||||
# to be explicitly saved in the editor? If this is set to "no" then
|
||||
# the changes made by --release will be automatically saved.
|
||||
# DEBCHANGE_FORCE_SAVE_ON_RELEASE=yes
|
||||
|
||||
##### debcheckout
|
||||
#
|
||||
# List of space-separated pairs REGEXP/REPLACEMENT_TEXT to define
|
||||
# custom rules to enable authenticated mode.
|
||||
# DEBCHECKOUT_AUTH_URLS=''
|
||||
#
|
||||
# For debian-dir-only repositories, also retrieve the source
|
||||
# package, unpack it, and move the missing files over.
|
||||
# DEBCHECKOUT_SOURCE=auto
|
||||
#
|
||||
# Username for authenticated mode, can be overridden with -u|--user.
|
||||
# DEBCHECKOUT_USER=''
|
||||
#
|
||||
# See debcheckout(1) for a more precise description of syntax and
|
||||
# semantics of these settings.
|
||||
|
||||
##### debclean
|
||||
#
|
||||
# Remove .deb, .changes, .dsc and .upload files?
|
||||
# DEBCLEAN_CLEANDEBS=no
|
||||
|
||||
##### debcommit
|
||||
#
|
||||
# Strip a leading "* " from commit messages taken from changelogs?
|
||||
# DEBCOMMIT_STRIP_MESSAGE=yes
|
||||
#
|
||||
# Sign created tags using OpenPGP?
|
||||
# DEBCOMMIT_SIGN_TAGS=no
|
||||
#
|
||||
# Take any uncommitted changes in the changelog in
|
||||
# to account when determining the commit message
|
||||
# for a release?
|
||||
# DEBCOMMIT_RELEASE_USE_CHANGELOG=no
|
||||
#
|
||||
# Sign commits using OpenPGP?
|
||||
# DEBCOMMIT_SIGN_COMMITS=no
|
||||
|
||||
##### debdiff
|
||||
#
|
||||
# Show directory names which appear in the filelist?
|
||||
# DEBDIFF_DIRS=no
|
||||
#
|
||||
# Compare control files?
|
||||
# DEBDIFF_CONTROL=yes
|
||||
#
|
||||
# Which control files to compare? A comma-separated list, with
|
||||
# possibilities such as postinst, config and so on; ALL means compare
|
||||
# all control files.
|
||||
# DEBDIFF_CONTROLFILES=control
|
||||
#
|
||||
# Show files which have moved between .debs?
|
||||
# DEBDIFF_SHOW_MOVED=no
|
||||
#
|
||||
# Option to pass to wdiff
|
||||
# DEBDIFF_WDIFF_OPT=
|
||||
#
|
||||
# Include the output of diffstat?
|
||||
# DEBDIFF_SHOW_DIFFSTAT=no
|
||||
#
|
||||
# Compare control files in source packages using wdiff?
|
||||
# DEBDIFF_WDIFF_SOURCE_CONTROL=no
|
||||
#
|
||||
# Always compare package in version order, rather than the order specified
|
||||
# on the command line?
|
||||
# DEBDIFF_AUTO_VER_SORT=no
|
||||
#
|
||||
# Unpack tarballs found in the top level source directory.
|
||||
# DEBDIFF_UNPACK_TARBALLS=yes
|
||||
#
|
||||
# Apply patches when comparing 3.0 (quilt)-format packages
|
||||
# DEBDIFF_APPLY_PATCHES=no
|
||||
|
||||
##### debdiff-apply
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### debi
|
||||
#
|
||||
# debc recognises the DEBRELEASE_DEBS_DIR variable; see debrelease
|
||||
# below for more information.
|
||||
|
||||
##### debpkg
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### debrelease
|
||||
#
|
||||
# This specifies which uploader program to use. As of devscripts ###VERSION###
|
||||
# the recognised values are "dupload" (default) and "dput". Check the
|
||||
# debrelease(1) manpage for any recent changes to this variable
|
||||
# DEBRELEASE_UPLOADER=dupload
|
||||
#
|
||||
# This specifies the directory, relative to the top of the source
|
||||
# tree, in which the .changes and .debs files are to be found. Note
|
||||
# that this also affects debc and debi.
|
||||
# DEBRELEASE_DEBS_DIR=..
|
||||
|
||||
##### debrsign
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### debsign
|
||||
#
|
||||
# debsign recognises the DEBRELEASE_DEBS_DIR variable; see debrelease
|
||||
# above for more information.
|
||||
#
|
||||
# Always re-sign files even if they are already signed, without prompting.
|
||||
# DEBSIGN_ALWAYS_RESIGN=yes
|
||||
#
|
||||
# Which signing program to use? gpg and pgp are the usual values; the
|
||||
# default is determined as described in the manpage.
|
||||
# Corresponds to -p option
|
||||
# DEBSIGN_PROGRAM=
|
||||
#
|
||||
# How the signing program works; must be either gpg or pgp as of
|
||||
# devscripts version ###VERSION###. The default is described in the
|
||||
# manpage. Corresponds to -sgpg and -spgp.
|
||||
# DEBSIGN_SIGNLIKE=
|
||||
#
|
||||
# Maintainer name (only used to determine OpenPGP key ID; -m option)
|
||||
# DEBSIGN_MAINT=
|
||||
#
|
||||
# OpenPGP key ID to use (-k option)
|
||||
# DEBSIGN_KEYID=
|
||||
|
||||
##### debsnap
|
||||
#
|
||||
# Where to put the directory named <prefix>-<package>/
|
||||
# default: source-$package_name if unset
|
||||
# DEBSNAP_DESTDIR=
|
||||
#
|
||||
# Verbosely show messages (yes/no)
|
||||
# default: no
|
||||
# DEBSNAP_VERBOSE=no
|
||||
#
|
||||
# The base URL of the archive to download from
|
||||
# DEBSNAP_BASE_URL=https://snapshot.debian.org
|
||||
#
|
||||
# A sed regexp to transform pool/<component>/f/foo into the desired layout
|
||||
# default: make the directory from pool/<component>/f/foo to pool/f/foo
|
||||
# DEBSNAP_CLEAN_REGEX="s@\([^/]*\)/[^/]*/\(.*\)@\1/\2@"
|
||||
#
|
||||
# Where the Sources.gz lives, subdirectory of DEBSNAP_BASE_URL/<clean dir>/
|
||||
# default: DEBSNAP_BASE_URL/<clean dir>/source/Sources.gz
|
||||
# DEBSNAP_SOURCES_GZ_PATH=source/Sources.gz
|
||||
|
||||
##### debuild
|
||||
#
|
||||
# Do we preserve the whole environment except for PATH?
|
||||
# DEBUILD_PRESERVE_ENV=no
|
||||
#
|
||||
# Are there any environment variables we should preserve? This should
|
||||
# be a comma-separated list.
|
||||
# DEBUILD_PRESERVE_ENVVARS=""
|
||||
#
|
||||
# How to set a preserved environment variable, in this case to set
|
||||
# FOO=bar.
|
||||
# DEBUILD_SET_ENVVAR_FOO=bar
|
||||
#
|
||||
# Do we check for the existence of the .orig.tar.gz before calling
|
||||
# dpkg-buildpackage?
|
||||
# DEBUILD_TGZ_CHECK=yes
|
||||
#
|
||||
# Corresponds to the dpkg-buildpackage -r option.
|
||||
# DEBUILD_ROOTCMD=fakeroot
|
||||
#
|
||||
# Extra options given to dpkg-buildpackage before any command-line
|
||||
# options specified. Single options containing spaces should be
|
||||
# quoted, for example "-m'Julian Gilbey <jdg@debian.org>' -us -uc"
|
||||
# If this contains a -r, -d or -D option, this will also be recognised
|
||||
# when running debuild binary|binary-arch|...
|
||||
# DEBUILD_DPKG_BUILDPACKAGE_OPTS=""
|
||||
#
|
||||
# Do we run lintian at the end of a full run?
|
||||
# DEBUILD_LINTIAN=yes
|
||||
#
|
||||
# Extra options given to lintian before any command-line options
|
||||
# specified.
|
||||
# DEBUILD_LINTIAN_OPTS=""
|
||||
#
|
||||
# Colon-separated list of options to be added to the beginning
|
||||
# of PATH once it has been sanitised
|
||||
# DEBUILD_PREPEND_PATH="/usr/lib/ccache"
|
||||
#
|
||||
# Credentials to pass to debrsign when signing dsc / changes files
|
||||
# Setting this option to a non-blank string implies using debrsign
|
||||
# DEBUILD_SIGNING_USERNAME="user@host"
|
||||
#
|
||||
# Hooks; see the manpage for details of these
|
||||
# DEBUILD_DPKG_BUILDPACKAGE_HOOK=""
|
||||
# DEBUILD_CLEAN_HOOK=""
|
||||
# DEBUILD_DPKG_SOURCE_HOOK=""
|
||||
# DEBUILD_BUILD_HOOK=""
|
||||
# DEBUILD_BINARY_HOOK=""
|
||||
# DEBUILD_FINAL_CLEAN_HOOK=""
|
||||
# DEBUILD_LINTIAN_HOOK=""
|
||||
# DEBUILD_SIGNING_HOOK=""
|
||||
# DEBUILD_POST_DPKG_BUILDPACKAGE_HOOK=""
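#
# Each hook is simply a shell command run at the corresponding point; a purely
# illustrative example would be listing the tree before the build starts:
# DEBUILD_DPKG_BUILDPACKAGE_HOOK="ls -l"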
|
||||
|
||||
##### dget
|
||||
#
|
||||
# Extra directories to search for files in addition to
|
||||
# /var/cache/apt/archives. This is a colon-separated list of directories.
|
||||
# DGET_PATH=""
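#
# eg. (illustrative) also search a local pbuilder result directory:
# DGET_PATH="/var/cache/pbuilder/result"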
|
||||
#
|
||||
# Unpack downloaded source packages
|
||||
# DGET_UNPACK=yes
|
||||
#
|
||||
# Verify source package signatures using dscverify
|
||||
# DGET_VERIFY=yes
|
||||
|
||||
##### diff2patches
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### dpkg-depcheck
|
||||
#
|
||||
# Extra options given to dpkg-depcheck before any command-line
|
||||
# options specified. For example: "-b --features=-catch-alternatives"
|
||||
# DPKG_DEPCHECK_OPTIONS=""
|
||||
|
||||
##### dpkg-genbuilddeps
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### dpkg-sig
|
||||
#
|
||||
# dpkg-sig is not a part of devscripts, but shares this configuration file.
|
||||
# It pays attention to the values of DEBSIGN_MAINT and DEBSIGN_KEY in
|
||||
# addition to the following.
|
||||
#
|
||||
# This OpenPGP key ID takes precedence over the rest
|
||||
# DPKGSIG_KEYID=
|
||||
#
|
||||
# Do we sign the .changes and .dsc files? See the manpage for more
|
||||
# info. Valid options are no, auto, yes, full and force_full.
|
||||
# DPKGSIG_SIGN_CHANGES=auto
|
||||
#
|
||||
# Do we cache the OpenPGP passphrase by default? This can be dangerous!
|
||||
# DPKGSIG_CACHE_PASS=no
|
||||
|
||||
##### dscverify
|
||||
#
|
||||
# A colon separated list of extra keyrings to read.
|
||||
# DSCVERIFY_KEYRINGS=""
|
||||
|
||||
##### getbuildlog
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### grep-excuses
|
||||
#
|
||||
# This specifies a default maintainer name or email to hunt for
|
||||
# GREP_EXCUSES_MAINTAINER=""
|
||||
#
|
||||
# Is this running on ftp-master.debian.org? If so, we use the local
|
||||
# excuses file
|
||||
# GREP_EXCUSES_FTP_MASTER=no
|
||||
|
||||
##### list-unreleased
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### mergechanges
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### manpage-alert
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### mass-bug
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
#### mk-build-deps
|
||||
#
|
||||
# Which tool to use for installing build depends?
|
||||
# MKBUILDDEPS_TOOL="/usr/bin/apt-get --no-install-recommends"
|
||||
#
|
||||
# Remove package files after install?
|
||||
# MKBUILDDEPS_REMOVE_AFTER_INSTALL=yes
|
||||
#
|
||||
# Tool used to gain root privileges to install the deb
|
||||
# MKBUILDDEPS_ROOTCMD=''
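#
# eg. (illustrative) use sudo to install the generated dependency package:
# MKBUILDDEPS_ROOTCMD='sudo'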
|
||||
|
||||
##### namecheck
|
||||
#
|
||||
# No variables currently; see .namecheckrc
|
||||
|
||||
##### nmudiff
|
||||
#
|
||||
# Number of days to indicate that an NMU upload has been delayed by
|
||||
# using the DELAYED upload queue. 0 indicates no delay.
|
||||
# Defaults to "XX" which adds a placeholder to the e-mail.
|
||||
# NMUDIFF_DELAY=3
|
||||
#
|
||||
# Should we use mutt to edit and send the message or just a plain old
|
||||
# editor?
|
||||
# NMUDIFF_MUTT=yes
|
||||
#
|
||||
# Should we always submit a new report (yes), always send to the bugs
|
||||
# which are being closed (no), or send to the bug being closed if
|
||||
# there is only one of them, otherwise send a new report (maybe)?
|
||||
# NMUDIFF_NEWREPORT=maybe
|
||||
#
|
||||
# nmudiff also uses the value of BTS_SENDMAIL_COMMAND if NMUDIFF_MUTT=no
|
||||
|
||||
##### plotchangelog
|
||||
#
|
||||
# Command line options to use (space separated). None of the options
|
||||
# should contain spaces. Use the PLOTCHANGELOG_GNUPLOT variable for
|
||||
# the --gnuplot command line option.
|
||||
# PLOTCHANGELOG_OPTIONS=""
|
||||
#
|
||||
# Here we can give gnuplot options. Any command line --gnuplot
|
||||
# commands will be appended to these.
|
||||
# PLOTCHANGELOG_GNUPLOT=""
|
||||
|
||||
##### pts-subscribe
|
||||
#
|
||||
# How long will we subscribe for by default? The default is 30 days.
|
||||
# Setting this to 'forever' means that no unsubscription request will
|
||||
# be scheduled.
|
||||
# PTS_UNTIL='now + 30 days'
|
||||
|
||||
##### rc-alert
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### rmadison
|
||||
#
|
||||
# Add a custom URL to the default list of shorthands so one
|
||||
# can use it with -u without having to specify the full URL
|
||||
#
|
||||
# RMADISON_URL_MAP_EXAMPLE=http://example.com/madison.cgi
|
||||
#
|
||||
# Default URL to use if none is specified on the command line.
|
||||
# RMADISON_DEFAULT_URL=debian
|
||||
#
|
||||
# Default architecture to use if none is specified on the command line.
|
||||
# use --architecture='*' to run an unrestricted query when
|
||||
# RMADISON_ARCHITECTURE is set.
|
||||
# RMADISON_ARCHITECTURE=source,i386,amd64,all
|
||||
|
||||
##### svnpath
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### tagpending
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### transition-check
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### uscan
|
||||
#
|
||||
# Should we download newer upstream files we come across?
|
||||
# USCAN_DOWNLOAD=yes
|
||||
#
|
||||
# Should we use FTP PASV mode for ftp:// links? 'default' means let
|
||||
# Net::FTP(3) make the choice (primarily based on the FTP_PASSIVE
|
||||
# environment variable); 'yes' and 'no' override the default
|
||||
# USCAN_PASV=default
|
||||
#
|
||||
# Should we create a symlink from the downloaded tar.gz file to
|
||||
# pkg_version.orig.tar.gz, rename the file to that name, or do nothing?
|
||||
# Options are 'symlink'/'yes', 'rename' or 'no'
|
||||
# USCAN_SYMLINK=yes
|
||||
#
|
||||
# Should we use DEHS style output (XML format)?
|
||||
# USCAN_DEHS_OUTPUT=no
|
||||
#
|
||||
# Should we give verbose output?
|
||||
# USCAN_VERBOSE=no
|
||||
#
|
||||
# What user agent string should we send with requests?
|
||||
# (Default is 'Debian uscan X.Y.Z')
|
||||
# USCAN_USER_AGENT=''
|
||||
#
|
||||
# Where should downloaded files be placed?
|
||||
# USCAN_DESTDIR=..
|
||||
#
|
||||
# Automatically repack bzipped tar or zip archives to gzipped tars?
|
||||
# USCAN_REPACK=no
|
||||
#
|
||||
# Use the Files-Excluded field in debian/copyright to determine whether
|
||||
# the orig tarball needs to be repacked to remove non-DFSG content?
|
||||
# USCAN_EXCLUSION=yes
|
||||
|
||||
##### uupdate
|
||||
#
|
||||
# Should we retain the pristine upstream source wherever possible?
|
||||
# UUPDATE_PRISTINE=yes
|
||||
#
|
||||
# Should we symlink the .orig.tar.gz file to its new name or
|
||||
# copy it instead? yes=symlink, no=copy
|
||||
# UUPDATE_SYMLINK_ORIG=yes
|
||||
#
|
||||
# Corresponds to the dpkg-buildpackage -r option and debuild
|
||||
# DEBUILD_ROOTCMD option. Normally, this can be left empty, as then
|
||||
# the debuild setting will be used.
|
||||
# UUPDATE_ROOTCMD=''
|
||||
|
||||
##### whodepends
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### who-uploads
|
||||
#
|
||||
# Display the date of the upload?
|
||||
# WHOUPLOADS_DATE=no
|
||||
#
|
||||
# Maximum number of uploads to display per package
|
||||
# WHOUPLOADS_MAXUPLOADS=3
|
||||
#
|
||||
# Colon-separated list of keyrings to examine by default
|
||||
# WHOUPLOADS_KEYRINGS=/usr/share/keyrings/debian-keyring.gpg:/usr/share/keyrings/debian-keyring.pgp:/usr/share/keyrings/debian-maintainers.gpg:/usr/share/keyrings/debian-nonupload.gpg
|
||||
|
||||
##### wnpp-alert
|
||||
#
|
||||
# No variables currently
|
||||
|
||||
##### wnpp-check
|
||||
#
|
||||
# No variables currently
|
133
cowpoke.conf
Normal file
@@ -0,0 +1,133 @@
# System configuration file for cowpoke
|
||||
# This file is sourced as a bash shell script, see cowpoke(1) for more details.
|
||||
|
||||
# Global defaults
|
||||
# These apply to every arch and dist in a single cowpoke invocation.
|
||||
# ------------------------------------------------------------------
|
||||
|
||||
# The hostname of the machine where cowbuilder is installed
|
||||
# eg. BUILDD_HOST="buildd.your.org"
|
||||
BUILDD_HOST=
|
||||
|
||||
# The username for unprivileged operations on BUILDD_HOST
|
||||
# If unset, the user that invoked cowpoke will be assumed, or the user that
|
||||
# is configured for the BUILDD_HOST in your ssh config will be used.
|
||||
#BUILDD_USER=
|
||||
|
||||
# The Debian architecture(s) to build for. A space separated list of
|
||||
# architectures may be used here to build for all of them in a single pass.
|
||||
#BUILDD_ARCH="$(dpkg-architecture -qDEB_BUILD_ARCH 2>/dev/null)"
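# eg. (illustrative) a multi-architecture pass: BUILDD_ARCH="amd64 i386"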
|
||||
|
||||
# The Debian distro to build for. A space separated list of distros may be
|
||||
# used here to build for all of them in a single pass.
|
||||
#BUILDD_DIST="unstable"
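# eg. (illustrative) BUILDD_DIST="unstable experimental"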
|
||||
|
||||
# The directory (under BUILDD_USER's home if relative) to upload packages
|
||||
# for building and where build logs and the result of post-build checks will
|
||||
# be placed
|
||||
#INCOMING_DIR="cowbuilder-incoming"
|
||||
|
||||
# The filesystem root for all pbuilder COW and result files. Arch and dist
|
||||
# specific subdirectories normally will be created under this. The apt cache
|
||||
# and temporary build directory will also be located under this path.
|
||||
#PBUILDER_BASE="/var/cache/pbuilder"
|
||||
|
||||
# The OpenPGP key ID to pass to debsign's -k option. eg. SIGN_KEYID="0x12345678"
|
||||
# Leave this unset if you do not wish to sign packages built in this way.
|
||||
#SIGN_KEYID=
|
||||
|
||||
# The 'host' alias to pass to dput. eg. UPLOAD_QUEUE="ftp-master"
|
||||
# Leave this unset if you do not wish to upload packages built this way.
|
||||
# This option will be ignored if SIGN_KEYID is unset.
|
||||
#UPLOAD_QUEUE=
|
||||
|
||||
# The command to use to gain root privileges on the remote build machine.
|
||||
# This is only required to invoke cowbuilder and allow it to enter its chroot,
|
||||
# so you may restrict this user to only being able to run that command with
|
||||
# escalated privileges. Something like this in sudoers will enable invoking
|
||||
# cowbuilder without an additional password entry required:
|
||||
# youruser ALL = NOPASSWD: /usr/sbin/cowbuilder
|
||||
# Alternatively you could use ssh with a forwarded key, or whatever other
|
||||
# mechanism suits your local access policy. su -c isn't really suitable
|
||||
# here due to its quoting requirements being different from all the rest.
|
||||
#BUILDD_ROOTCMD="sudo"
|
||||
|
||||
# The utility to use when creating a new build root. Alternatives are
|
||||
# debootstrap or cdebootstrap.
|
||||
#DEBOOTSTRAP="cdebootstrap"
|
||||
|
||||
# If set, package files resulting from the build will be copied to the path
|
||||
# (local or remote) that this is set to, after the build completes. It is
|
||||
# unset by default and can be overridden with --return or --no-return.
|
||||
# The given path must exist, it will not be created.
|
||||
#RETURN_DIR="."
|
||||
|
||||
# =============================================================================
|
||||
#
|
||||
# Arch and dist specific options
|
||||
# These are variables of the form: $arch_$dist_VAR, which apply only for a
|
||||
# particular target arch/dist build. The following variables are supported:
|
||||
#
|
||||
# $arch_$dist_RESULT_DIR - The directory where pbuilder/cowbuilder will place
|
||||
# the built package, and where any previously built
|
||||
# packages may be found for comparison using debdiff
|
||||
# after building.
|
||||
#
|
||||
# $arch_$dist_BASE_PATH - The directory where the COW master files are found.
|
||||
#
|
||||
# $arch_$dist_BASE_DIST - The code name to pass as the --distribution option
|
||||
# for cowbuilder instead of $dist. This is necessary
|
||||
# when $dist is a locally significant name assigned
|
||||
# to some specially configured build chroot, such as
|
||||
# 'wheezy_backports', and not the formal suite name
|
||||
# of a distro release known to debootstrap.
|
||||
#
|
||||
# $arch_$dist_CREATE_OPTS - A bash array containing additional options to pass
|
||||
# verbatim to cowbuilder when this chroot is created
|
||||
# for the first time (using the --create option).
|
||||
# This is useful when options like --othermirror are
|
||||
# wanted to create specialised chroot configurations
|
||||
# such as 'wheezy_backports'.
|
||||
#
|
||||
# $arch_$dist_UPDATE_OPTS - A bash array containing additional options to pass
|
||||
# verbatim to cowbuilder each time the base of this
|
||||
# chroot is updated.
|
||||
#
|
||||
# $arch_$dist_BUILD_OPTS - A bash array containing additional options to pass
|
||||
# verbatim to cowbuilder each time a package build is
|
||||
# performed in this chroot. This is useful when you
|
||||
# want to use some option like --twice which cowpoke
|
||||
# does not directly need to care about.
|
||||
#
|
||||
# Each element in these arrays corresponds to a single argument (in the ARGV
|
||||
# sense) that will be passed to cowbuilder. This ensures that arguments which
|
||||
# may contain whitespace or have strange quoting requirements or other special
|
||||
# characters will not be mangled before they get to cowbuilder.
|
||||
#
|
||||
# Bash arrays are initialised using the following form:
|
||||
# VARIABLE=( "arg1" "arg 2" "--option" "value" "--opt=val" "etc. etc." )
|
||||
#
|
||||
#
|
||||
# $arch_$dist_SIGN_KEYID - An optional arch and dist specific override for
|
||||
# the global SIGN_KEYID option.
|
||||
#
|
||||
# $arch_$dist_UPLOAD_QUEUE - An optional arch and dist specific override for
|
||||
# the global UPLOAD_QUEUE option.
|
||||
#
|
||||
# -----------------------------------------------------------------------------
|
||||
|
||||
#amd64_unstable_RESULT_DIR="$PBUILDER_BASE/amd64/unstable/result"
|
||||
#amd64_unstable_BASE_PATH="$PBUILDER_BASE/amd64/unstable/base.cow"
|
||||
|
||||
#amd64_experimental_RESULT_DIR="$PBUILDER_BASE/amd64/experimental/result"
|
||||
#amd64_experimental_BASE_PATH="$PBUILDER_BASE/amd64/experimental/base.cow"
|
||||
|
||||
#i386_unstable_RESULT_DIR="$PBUILDER_BASE/i386/unstable/result"
|
||||
#i386_unstable_BASE_PATH="$PBUILDER_BASE/i386/unstable/base.cow"
|
||||
|
||||
#i386_experimental_RESULT_DIR="$PBUILDER_BASE/i386/experimental/result"
|
||||
#i386_experimental_BASE_PATH="$PBUILDER_BASE/i386/experimental/base.cow"
|
||||
|
||||
#amd64_wheezy_bpo_BASE_DIST="wheezy"
|
||||
#amd64_wheezy_bpo_CREATE_OPTS=(--othermirror "deb http://deb.debian.org/debian wheezy-backports main")
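
# A further illustrative (hypothetical) use of the per-arch/dist arrays: pass
# --twice to every build done in the amd64/unstable chroot, one argument per
# array element.
#amd64_unstable_BUILD_OPTS=( "--twice" )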
|
||||
|
29
doc/Makefile
Normal file
@@ -0,0 +1,29 @@
include ../Makefile.common
|
||||
|
||||
all: devscripts.7
|
||||
|
||||
clean:
|
||||
rm -f devscripts.7 devscripts.7.tmp.*
|
||||
|
||||
# There is a slight chance this gets called twice, once from here and once
|
||||
# from ../po4a/Makefile. Treat files with care.
|
||||
PID := $(shell echo $$$$-$$PPID)
|
||||
devscripts.7: devscripts.7.in ../README.md genmanpage.pl
|
||||
cat $< > $@.tmp.$(PID)
|
||||
cat ../README.md | \
|
||||
awk '/^- annotate-output/,/^ mailing lists./'|sed -e '/^[[:space:]]*$$/d' -e 's/^/ /g' | \
|
||||
perl genmanpage.pl \
|
||||
>> $@.tmp.$(PID)
|
||||
mv $@.tmp.$(PID) $@
|
||||
|
||||
install: install_man1 install_man5 install_man7
|
||||
|
||||
install_man1: *.1
|
||||
install -d "$(DESTDIR)$(MAN1DIR)"
|
||||
install -m0644 -t "$(DESTDIR)$(MAN1DIR)" $^
|
||||
install_man5: *.5
|
||||
install -d "$(DESTDIR)$(MAN5DIR)"
|
||||
install -m0644 -t "$(DESTDIR)$(MAN5DIR)" $^
|
||||
install_man7: *.7 devscripts.7
|
||||
install -d "$(DESTDIR)$(MAN7DIR)"
|
||||
install -m0644 -t "$(DESTDIR)$(MAN7DIR)" $^
|
30
doc/devscripts.7.in
Normal file
@@ -0,0 +1,30 @@
.TH DEVSCRIPTS 7 "Debian Utilities" "DEBIAN" \" -*- nroff -*-
|
||||
.SH NAME
|
||||
devscripts \- scripts to ease the lives of Debian developers
|
||||
.SH DESCRIPTION
|
||||
The \fBdevscripts\fR package provides a collection of scripts which
|
||||
may be of use to Debian developers and others wishing to build Debian
|
||||
packages. For a summary of the available scripts, please see the file
|
||||
\fI/usr/share/doc/devscripts/README.gz\fR, and for full details, please
|
||||
see the individual manpages. They are contributed by multiple
|
||||
developers; for details of the authors, please see the code or
|
||||
manpages.
|
||||
|
||||
Also, the directory \fI/usr/share/doc/devscripts/examples\fR contains an
|
||||
example \fBexim\fR script for sorting mail arriving to Debian mailing
|
||||
lists.
|
||||
.SH ENVIRONMENT
|
||||
Several scripts of the devscripts suite use the following environment
|
||||
variables. Check the man pages of individual scripts for more details on how the
|
||||
variables are used.
|
||||
.IX Header "ENVIRONMENT"
|
||||
.IP "\s-1DEBEMAIL\s0" 4
|
||||
.IX Item "DEBEMAIL"
|
||||
Email of the person acting on a given Debian package via devscripts.
|
||||
.IP "\s-1DEBFULLNAME\s0" 4
|
||||
.IX Item "DEBFULLNAME"
|
||||
Full name (first + family) of the person acting on a given Debian package via
|
||||
devscripts.
|
||||
.SH SCRIPTS
|
||||
Here is the complete list of available devscripts. See their man pages
|
||||
for additional documentation.
|
60
doc/devscripts.conf.5
Normal file
@@ -0,0 +1,60 @@
.TH DEVSCRIPTS.CONF 5 "Debian Utilities" "DEBIAN" \" -*- nroff -*-
|
||||
.SH NAME
|
||||
devscripts.conf \- configuration file for the devscripts package
|
||||
.SH DESCRIPTION
|
||||
The \fBdevscripts\fR package provides a collection of scripts which
|
||||
may be of use to Debian developers and others wishing to build Debian
|
||||
packages. Many of these have options which can be configured on a
|
||||
system-wide and per-user basis.
|
||||
.PP
|
||||
Every script in the \fBdevscripts\fR package which makes use of values
|
||||
from these configuration files describes the specific settings
|
||||
recognised in its own manpage. (For a list of the scripts, either see
|
||||
\fI/usr/share/doc/devscripts/README.gz\fR or look at the output of
|
||||
\fIdpkg \-L devscripts | grep /usr/bin\fR.)
|
||||
.PP
|
||||
The two configuration files are \fI/etc/devscripts.conf\fR for
|
||||
system-wide defaults and \fI~/.devscripts\fR for per-user settings.
|
||||
They are written with \fBbash\fR(1) syntax, but should only have
|
||||
comments and simple variable assignments in them; they are both
|
||||
sourced (if present) by many of the \fBdevscripts\fR scripts.
|
||||
Variables corresponding to simple switches should have one of the
|
||||
values \fIyes\fR and \fIno\fR; any other setting is regarded as
|
||||
equivalent to the default setting.
|
||||
.PP
|
||||
All variable names are written in uppercase, and begin with the script
|
||||
name. Package-wide variables begin with "DEVSCRIPTS", and are listed
|
||||
below, as well as in the relevant manpages.
|
||||
.PP
|
||||
For a list of all of the available option variables, along with their
|
||||
default settings, see the example configuration file
|
||||
\fI/usr/share/doc/devscripts/devscripts.conf.ex\fR. This is copied to
|
||||
\fI/etc/devscripts.conf\fR when the \fBdevscripts\fR package is first
|
||||
installed. Information about configuration options introduced in
|
||||
newer versions of the package will be appended to
|
||||
\fI/etc/devscripts.conf\fR when the package is upgraded.
|
||||
.PP
|
||||
Every script which reads the configuration files can be forced to
|
||||
ignore them by using \fB\-\-no-conf\fR as the \fIfirst\fR command-line
|
||||
option.
|
||||
.SH "PACKAGE-WIDE VARIABLES"
|
||||
The currently recognised package-wide variables are:
|
||||
.TP
|
||||
.BR DEVSCRIPTS_CHECK_DIRNAME_LEVEL ", " DEVSCRIPTS_CHECK_DIRNAME_REGEX
|
||||
These variables control scripts which change directory to find a
|
||||
\fIdebian/changelog\fR file or suchlike, and some other miscellaneous
|
||||
cases. In order to prevent unwanted, even possibly dangerous,
|
||||
behaviour, these variables control when actions will be performed.
|
||||
The scripts which currently make use of these variables are:
|
||||
\fBdebc\fR, \fBdebchange\fR/\fBdch\fR, \fBdebclean\fR, \fBdebi\fR,
|
||||
\fBdebrelease\fR, \fBdebuild\fR and \fBuscan\fR, but this list may
|
||||
change with time (and I may not remember to update this manpage).
|
||||
Please see the manpages of individual scripts for details of the
|
||||
specific behaviour for each script.
|
||||
.SH "SEE ALSO"
|
||||
.BR devscripts (1)
|
||||
and
|
||||
.IR /usr/share/doc/devscripts/README.gz.
|
||||
.SH AUTHOR
|
||||
This manpage was written for the \fBdevscripts\fR package by the
|
||||
package maintainer Julian Gilbey <jdg@debian.org>.
|
43
doc/edit-patch.1
Normal file
@@ -0,0 +1,43 @@
.TH EDIT-PATCH "1" "Debian Utilities" "DEBIAN"
|
||||
.SH NAME
|
||||
\fBedit-patch\fR, \fBadd-patch\fR \- tool for preparing patches for Debian
|
||||
source packages
|
||||
|
||||
.SH SYNOPSIS
|
||||
\fBedit-patch\fR \fIpath/to/patch\fR
|
||||
|
||||
\fBadd-patch\fR \fIpath/to/patch\fR
|
||||
|
||||
.SH DESCRIPTION
|
||||
\fBedit-patch\fR is a wrapper script around the Quilt, CDBS, and dpatch patch
|
||||
systems. It simplifies the process of preparing and editing patches to Debian
|
||||
source packages without requiring the user to know which patch
system is in use.
|
||||
Run from inside the root directory of the source package, \fBedit-patch\fR can
|
||||
be used to edit existing patches located in \fIdebian/patches\fR.
|
||||
|
||||
It can also be used to incorporate new patches.
|
||||
If pointed at a patch not already present, it will copy the patch to
|
||||
\fIdebian/patches\fR in the correct format for the patch system in use.
|
||||
Next, the patch is applied and a subshell is opened in order to edit the patch.
|
||||
Typing \fBexit\fR or pressing Ctrl-d will close the subshell and launch an editor
|
||||
to record the \fIdebian/changelog\fR entry.
|
||||
|
||||
\fBedit-patch\fR is integrated with the Bazaar and Git version control systems.
|
||||
The patch will be automatically added to the tree, and the \fIdebian/changelog\fR
|
||||
entry will be used as the commit message.
|
||||
|
||||
If no patch system is present, the patch is applied inline,
|
||||
and a copy is stored in \fIdebian/patches-applied\fR.
|
||||
|
||||
\fBadd-patch\fR is the non-interactive version of \fBedit-patch\fR.
|
||||
The patch will be incorporated but no editor or subshell will be
|
||||
spawned.
|
||||
|
||||
.SH AUTHORS
|
||||
\fBedit-patch\fR was written by Daniel Holbach <daniel.holbach@canonical.com>,
|
||||
Michael Vogt <michael.vogt@canonical.com>, and David Futcher <bobbo@ubuntu.com>.
|
||||
|
||||
This manual page was written by Andrew Starr-Bochicchio <a.starr.b@gmail.com>.
|
||||
.PP
|
||||
Both are released under the terms of the GNU General Public License, version 3.
|
30
doc/genmanpage.pl
Executable file
@@ -0,0 +1,30 @@
#!/usr/bin/perl
|
||||
|
||||
use strict;
|
||||
use warnings;
|
||||
|
||||
# Define item leadin/leadout for man output
|
||||
my $ITEM_LEADIN = '.IP "\fI';
|
||||
my $ITEM_LEADOUT = '\fR(1)"';
|
||||
|
||||
my $package;
|
||||
my $description;
|
||||
|
||||
|
||||
# Parse the shortened README file
|
||||
while (<>) {
|
||||
chomp;
|
||||
# A line starting with ' -' indicates a script
|
||||
if (/^ - ([^:]*): (.*)/) {
|
||||
if ($package and $description) {
|
||||
# If we get here, then we need to output the man code
|
||||
print $ITEM_LEADIN . $package . $ITEM_LEADOUT . "\n";
|
||||
print $description . "\n";
|
||||
}
|
||||
$package = $1;
|
||||
$description = $2;
|
||||
} else {
|
||||
s/^ //;
|
||||
$description .= $_;
|
||||
}
|
||||
}
|
55
doc/suspicious-source.1
Normal file
@@ -0,0 +1,55 @@
.\" Copyright (c) 2010, Benjamin Drung <bdrung@debian.org>
|
||||
.\"
|
||||
.\" Permission to use, copy, modify, and/or distribute this software for any
|
||||
.\" purpose with or without fee is hereby granted, provided that the above
|
||||
.\" copyright notice and this permission notice appear in all copies.
|
||||
.\"
|
||||
.\" THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
|
||||
.\" WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
|
||||
.\" MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
|
||||
.\" ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
|
||||
.\" WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
|
||||
.\" ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
|
||||
.\" OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||
.\"
|
||||
.TH SUSPICIOUS\-SOURCE 1 "Debian Utilities" "DEBIAN"
|
||||
|
||||
.SH NAME
|
||||
suspicious\-source \- search for files that do not meet the GPL's
|
||||
definition of "source" for a work
|
||||
|
||||
.SH SYNOPSIS
|
||||
\fBsuspicious\-source\fP [\fIoptions\fR]
|
||||
|
||||
.SH DESCRIPTION
|
||||
\fBsuspicious\-source\fP outputs a list of files which are probably not
|
||||
the source form of a work.
|
||||
This should be run in the root of a source tree to find files which
|
||||
might not be, in the definition from the GNU GPL, the "preferred form
|
||||
of the work for making modifications to it".
|
||||
.PP
|
||||
The files inside version control system directories (like
|
||||
\fI.bzr/\fR or \fICVS/\fR) are not considered.
|
||||
|
||||
.SH OPTIONS
|
||||
.TP
|
||||
\fB\-h\fR, \fB\-\-help\fR
|
||||
Show this help message and exit.
|
||||
.TP
|
||||
\fB\-v\fR, \fB\-\-verbose\fR
|
||||
Print more information.
|
||||
.TP
|
||||
\fB\-d \fIdirectory\fR, \fB\-\-directory=\fIdirectory\fR
|
||||
Check the files in the specified \fIdirectory\fR instead of the current directory.
|
||||
.TP
|
||||
\fB\-m \fImimetype\fR, \fB\-\-mimetype=\fImimetype\fR
|
||||
Add \fImimetype\fR to the list of white-listed MIME types.
|
||||
.TP
|
||||
\fB\-e \fIextension\fR, \fB\-\-extension=\fIextension\fR
|
||||
Add \fIextension\fR to the list of white-listed extensions.
|
||||
|
||||
.SH AUTHORS
|
||||
\fBsuspicious\-source\fP and this manpage have been written by
|
||||
Benjamin Drung <bdrung@debian.org>.
|
||||
.PP
|
||||
Both are released under the ISC license.
|
37
doc/what-patch.1
Normal file
@@ -0,0 +1,37 @@
.TH WHAT\-PATCH "1" "Debian Utilities" "DEBIAN"
|
||||
.SH NAME
|
||||
what\-patch \- detect which patch system a Debian package uses
|
||||
|
||||
.SH SYNOPSIS
|
||||
.B what\-patch\fR [\fIoptions\fR]
|
||||
|
||||
.SH DESCRIPTION
|
||||
\fBwhat\-patch\fR examines the \fIdebian/rules\fR file to determine which patch
|
||||
system the Debian package is using.
|
||||
.PP
|
||||
\fBwhat\-patch\fR should be run from the root directory of the Debian source
|
||||
package.
|
||||
|
||||
.SH OPTIONS
|
||||
Listed below are the command line options for \fBwhat\-patch\fR:
|
||||
.TP
|
||||
.BR \-h ", " \-\-help
|
||||
Display a help message and exit.
|
||||
.TP
|
||||
.B \-v
|
||||
Enable verbose mode.
|
||||
This will include a listing of any files modified outside of the \fIdebian/\fR
|
||||
directory and report any additional details about the patch system if
|
||||
available.
|
||||
|
||||
.SH AUTHORS
|
||||
\fBwhat\-patch\fR was written by Kees Cook <kees@ubuntu.com>,
|
||||
Siegfried-A. Gevatter <rainct@ubuntu.com>, and Daniel Hahler
|
||||
<ubuntu@thequod.de>, among others.
|
||||
This manual page was written by Jonathan Patrick Davies <jpds@ubuntu.com>.
|
||||
.PP
|
||||
Both are released under the GNU General Public License, version 3 or later.
|
||||
|
||||
.SH SEE ALSO
|
||||
The Ubuntu MOTU team has some documentation about patch systems at the Ubuntu
|
||||
wiki: \fIhttps://wiki.ubuntu.com/PackagingGuide/PatchSystems\fR
|
96
doc/wrap-and-sort.1
Normal file
@@ -0,0 +1,96 @@
.\" Copyright (c) 2010, Benjamin Drung <bdrung@debian.org>
|
||||
.\"
|
||||
.\" Permission to use, copy, modify, and/or distribute this software for any
|
||||
.\" purpose with or without fee is hereby granted, provided that the above
|
||||
.\" copyright notice and this permission notice appear in all copies.
|
||||
.\"
|
||||
.\" THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
|
||||
.\" WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
|
||||
.\" MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
|
||||
.\" ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
|
||||
.\" WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
|
||||
.\" ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
|
||||
.\" OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||
.\"
|
||||
.TH WRAP\-AND\-SORT 1 "Debian Utilities" "DEBIAN"
|
||||
.SH NAME
|
||||
wrap-and-sort \- wrap long lines and sort items in Debian packaging files
|
||||
.SH SYNOPSIS
|
||||
.B wrap-and-sort
|
||||
[\fIoptions\fR]
|
||||
|
||||
.SH DESCRIPTION
|
||||
\fBwrap\-and\-sort\fP wraps the package lists in Debian control files. By
|
||||
default the lists will only be split into multiple lines if the entries are longer
|
||||
than the maximum line length limit of 79 characters. \fBwrap\-and\-sort\fP sorts
|
||||
the package lists in Debian control files and all \fI.dirs\fR, \fI.docs\fR,
|
||||
\fI.examples\fR, \fI.info\fR, \fI.install\fR, \fI.links\fR, \fI.maintscript\fR,
|
||||
and \fI.manpages\fR files. Besides that, \fBwrap\-and\-sort\fP removes trailing
|
||||
spaces in these files.
|
||||
.PP
|
||||
This script should be run in the root of a Debian package tree. It searches for
|
||||
\fIcontrol\fR, \fIcontrol*.in\fR, \fIcopyright\fR, \fIcopyright.in\fR,
|
||||
\fIinstall\fR, and \fI*.install\fR in the \fIdebian\fR directory.
|
||||
|
||||
.SH OPTIONS
|
||||
.TP
|
||||
\fB\-h\fR, \fB\-\-help\fR
|
||||
Show this help message and exit. Will also print the default values for
|
||||
the options below.
|
||||
.TP
|
||||
\fB\-a\fR, \fB\-\-[no\-]wrap\-always\fR
|
||||
Wrap all package lists in the Debian \fIcontrol\fR file
|
||||
even if they do not exceed the line length limit and could fit in one line.
|
||||
.TP
|
||||
\fB\-s\fR, \fB\-\-[no\-]short\-indent\fR
|
||||
Indent wrapped lines by a single space, instead of in\-line with the
|
||||
field name.
|
||||
.TP
|
||||
\fB\-b\fR, \fB\-\-[no\-]sort\-binary\-packages\fR
|
||||
Sort binary package paragraphs by name.
|
||||
.TP
|
||||
\fB\-k\fR, \fB\-\-[no-]keep\-first\fR
|
||||
When sorting binary package paragraphs, leave the first one at the top.
|
||||
|
||||
Unqualified
|
||||
.BR debhelper (7)
|
||||
configuration files are applied to the first package.
|
||||
.TP
|
||||
\fB\-n\fR, \fB\-\-[no\-]cleanup\fR
|
||||
Remove trailing whitespaces.
|
||||
.TP
|
||||
\fB\-t\fR, \fB\-\-[no\-]trailing\-comma\fR
|
||||
Add a trailing comma at the end of the sorted fields.
|
||||
This minimizes future differences in the VCS commits when additional
|
||||
dependencies are appended or removed.
|
||||
.TP
|
||||
\fB\-d \fIpath\fR, \fB\-\-debian\-directory=\fIpath\fR
|
||||
Location of the \fIdebian\fR directory (default: \fI./debian\fR).
|
||||
.TP
|
||||
\fB\-f \fIfile\fR, \fB\-\-file=\fIfile\fR
|
||||
Wrap and sort only the specified \fIfile\fR.
|
||||
You can specify this parameter multiple times.
|
||||
All supported files will be processed if no files are specified.
|
||||
.TP
|
||||
\fB\-v\fR, \fB\-\-verbose\fR
|
||||
Print all files that are touched.
|
||||
.TP
|
||||
\fB\-\-max\-line\-length=\fImax_line_length\fR
|
||||
Set the maximum allowed line length. Package lists in the Debian \fIcontrol\fR
|
||||
file that exceed this length limit will be wrapped.
|
||||
|
||||
The default maximum line length is 79 characters.
|
||||
.TP
|
||||
\fB\-N\fR, \fB\-\-dry\-run\fR
|
||||
Do not modify any file, instead only print the files that would be modified.
|
||||
|
||||
.TP
|
||||
\fB\-\-experimental\-rts\-parser\fR
|
||||
Temporary option accepted for compatibility with an experiment. It no longer
|
||||
does anything. Please remove any use of it.
|
||||
|
||||
.SH AUTHORS
|
||||
\fBwrap\-and\-sort\fP and this manpage have been written by
|
||||
Benjamin Drung <bdrung@debian.org>.
|
||||
.PP
|
||||
Both are released under the ISC license.
|
20
examples/debbisect_buildsrc.sh
Executable file
@@ -0,0 +1,20 @@
#!/bin/sh
|
||||
#
|
||||
# use this script to build a source package with debbisect like this:
|
||||
#
|
||||
# $ DEBIAN_BISECT_SRCPKG=mysrc ./debbisect --cache=./cache "two years ago" yesterday /usr/share/doc/devscripts/examples/debbisect_buildsrc.sh
|
||||
#
|
||||
# copy this script and edit it if you want to customize it
|
||||
|
||||
set -eu
|
||||
|
||||
mmdebstrap --variant=apt unstable \
|
||||
--aptopt='Apt::Key::gpgvcommand "/usr/share/debuerreotype/scripts/.gpgv-ignore-expiration.sh"' \
|
||||
--aptopt='Acquire::Check-Valid-Until "false"' \
|
||||
--customize-hook='chroot "$1" apt-get --yes build-dep '"$DEBIAN_BISECT_SRCPKG" \
|
||||
--customize-hook='chroot "$1" sh -c "dpkg-query -W > /pkglist"' \
|
||||
--customize-hook='download /pkglist ./debbisect.'"$DEBIAN_BISECT_TIMESTAMP"'.pkglist' \
|
||||
--customize-hook='rm "$1"/pkglist' \
|
||||
--customize-hook="chroot \"\$1\" dpkg-query --showformat '\${binary:Package}=\${Version}\n' --show" \
|
||||
--customize-hook='chroot "$1" apt-get source --build '"$DEBIAN_BISECT_SRCPKG" \
|
||||
/dev/null $DEBIAN_BISECT_MIRROR "deb-src $DEBIAN_BISECT_MIRROR unstable main"
|
6
examples/forward.exim
Normal file
@@ -0,0 +1,6 @@
# Exim Filter <<== do not edit or remove this line!
|
||||
# Assortment of debian lists
|
||||
if $header_resent-sender: matches "debian-(.*)-request@"
|
||||
then
|
||||
save $home/mail/debian-$1
|
||||
endif
|
141
lib/Devscripts/Compression.pm
Normal file
@@ -0,0 +1,141 @@
# Copyright James McCoy <jamessan@debian.org> 2013.
|
||||
# Modifications copyright 2002 Julian Gilbey <jdg@debian.org>
|
||||
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
package Devscripts::Compression;
|
||||
|
||||
use Dpkg::Compression qw(
|
||||
!compression_get_file_extension
|
||||
!compression_get_cmdline_compress
|
||||
!compression_get_cmdline_decompress
|
||||
);
|
||||
use Dpkg::IPC;
|
||||
use Exporter qw(import);
|
||||
|
||||
our @EXPORT = (
|
||||
@Dpkg::Compression::EXPORT,
|
||||
qw(
|
||||
compression_get_file_extension
|
||||
compression_get_cmdline_compress
|
||||
compression_get_cmdline_decompress
|
||||
compression_guess_from_file
|
||||
),
|
||||
);
|
||||
|
||||
eval {
|
||||
Dpkg::Compression->VERSION(2.01);
|
||||
1;
|
||||
} or do {
|
||||
# Ensure we have the compression getters, regardless of the version of
|
||||
# Dpkg::Compression to ease backporting.
|
||||
*{'Dpkg::Compression::compression_get_file_extension'} = sub {
|
||||
my $comp = shift;
|
||||
return compression_get_property($comp, 'file_ext');
|
||||
};
|
||||
*{'Dpkg::Compression::compression_get_cmdline_compress'} = sub {
|
||||
my $comp = shift;
|
||||
my @prog = @{ compression_get_property($comp, 'comp_prog') };
|
||||
push @prog, '-' . compression_get_property($comp, 'default_level');
|
||||
return @prog;
|
||||
};
|
||||
*{'Dpkg::Compression::compression_get_cmdline_decompress'} = sub {
|
||||
my $comp = shift;
|
||||
my @prog = @{ compression_get_property($comp, 'decomp_prog') };
|
||||
return @prog;
|
||||
};
|
||||
};
|
||||
|
||||
# This can potentially be moved to Dpkg::Compression
|
||||
|
||||
my %mime2comp = (
|
||||
"application/x-gzip" => "gzip",
|
||||
"application/gzip" => "gzip",
|
||||
"application/x-bzip2" => "bzip2",
|
||||
"application/bzip2 " => "bzip2",
|
||||
"application/x-xz" => "xz",
|
||||
"application/xz" => "xz",
|
||||
"application/zip" => "zip",
|
||||
"application/x-compress" => "compress",
|
||||
"application/java-archive" => "zip",
|
||||
"application/x-tar" => "tar",
|
||||
"application/zstd" => "zst",
|
||||
"application/x-zstd" => "zst",
|
||||
"application/x-lzip" => "lzip",
|
||||
);
|
||||
|
||||
sub compression_guess_from_file {
|
||||
my $filename = shift;
|
||||
my $mimetype;
|
||||
spawn(
|
||||
exec => ['file', '--dereference', '--brief', '--mime-type', $filename],
|
||||
to_string => \$mimetype,
|
||||
wait_child => 1
|
||||
);
|
||||
chomp($mimetype);
|
||||
if (exists $mime2comp{$mimetype}) {
|
||||
return $mime2comp{$mimetype};
|
||||
} else {
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
# comp_prog and default_level aren't provided because a) they aren't needed in
|
||||
# devscripts and b) the Dpkg::Compression API isn't rich enough to support
|
||||
# these as compressors
|
||||
my %comp_properties = (
|
||||
compress => {
|
||||
file_ext => 'Z',
|
||||
decomp_prog => ['uncompress'],
|
||||
},
|
||||
lzip => {
|
||||
file_ext => 'lz',
|
||||
decomp_prog => ['lzip', '--decompress', '--keep'],
|
||||
},
|
||||
zip => {
|
||||
file_ext => 'zip',
|
||||
decomp_prog => ['unzip'],
|
||||
},
|
||||
zst => {
|
||||
file_ext => 'zst',
|
||||
#comp_prog => ['zstd'],
|
||||
decomp_prog => ['unzstd'],
|
||||
default_level => 3,
|
||||
});
|
||||
|
||||
sub compression_get_file_extension {
|
||||
my $comp = shift;
|
||||
if (!exists $comp_properties{$comp}) {
|
||||
return Dpkg::Compression::compression_get_file_extension($comp);
|
||||
}
|
||||
return $comp_properties{$comp}{file_ext};
|
||||
}
|
||||
|
||||
sub compression_get_cmdline_compress {
|
||||
my $comp = shift;
|
||||
if (!exists $comp_properties{$comp}) {
|
||||
return Dpkg::Compression::compression_get_cmdline_compress($comp);
|
||||
}
|
||||
return @{ $comp_properties{$comp}{comp_prog} };
|
||||
}
|
||||
|
||||
sub compression_get_cmdline_decompress {
|
||||
my $comp = shift;
|
||||
if (!exists $comp_properties{$comp}) {
|
||||
return Dpkg::Compression::compression_get_cmdline_decompress($comp);
|
||||
}
|
||||
return @{ $comp_properties{$comp}{decomp_prog} };
|
||||
}
|
||||
|
||||
1;
|
418
lib/Devscripts/Config.pm
Normal file
@@ -0,0 +1,418 @@
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Config - devscripts Perl scripts configuration object
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
# Configuration module
|
||||
package Devscripts::My::Config;
|
||||
use Moo;
|
||||
extends 'Devscripts::Config';
|
||||
|
||||
use constant keys => [
|
||||
[ 'text1=s', 'MY_TEXT', qr/^\S/, 'Default_text' ],
|
||||
# ...
|
||||
];
|
||||
|
||||
has text1 => ( is => 'rw' );
|
||||
|
||||
# Main package or script
|
||||
package Devscripts::My;
|
||||
|
||||
use Moo;
|
||||
my $config = Devscripts::My::Config->new->parse;
|
||||
1;
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
Devscripts Perl scripts configuration object. It can scan configuration files
|
||||
(B</etc/devscripts.conf> and B<~/.devscripts>) and command line arguments.
|
||||
|
||||
A devscripts configuration package has just to declare:
|
||||
|
||||
=over
|
||||
|
||||
=item B<keys> constant: array ref I<(see below)>
|
||||
|
||||
=item B<rules> constant: hash ref I<(see below)>
|
||||
|
||||
=back
|
||||
|
||||
=head1 KEYS
|
||||
|
||||
Each element of B<keys> constant is an array containing four elements which can
|
||||
be undefined:
|
||||
|
||||
=over
|
||||
|
||||
=item the string to give to L<Getopt::Long>
|
||||
|
||||
=item the name of the B<devscripts.conf> key
|
||||
|
||||
=item the rule to check value. It can be:
|
||||
|
||||
=over
|
||||
|
||||
=item B<regexp> ref: will be applied to the value. If it fails against the
|
||||
devscripts.conf value, Devscripts::Config will warn. If it fails against the
|
||||
command line argument, Devscripts::Config will die.
|
||||
|
||||
=item B<sub> ref: function will be called with 2 arguments: current config
|
||||
object and proposed value. Function must return a true value to continue or
|
||||
0 to stop. This is not simply a "check" function: Devscripts::Config does
nothing with the result other than use it to decide whether to continue with
the next argument or stop.
|
||||
|
||||
=item B<"bool"> string: means that value is a boolean. devscripts.conf value
|
||||
can be either "yes", 1, "no", 0.
|
||||
|
||||
=back
|
||||
|
||||
=item the default value
|
||||
|
||||
=back
|
||||
|
||||
=head2 RULES
|
||||
|
||||
It is possible to declare some additional rules to check the logic between
|
||||
options:
|
||||
|
||||
use constant rules => [
|
||||
sub {
|
||||
my($self)=@_;
|
||||
# OK
|
||||
return 1 if( $self->a < $self->b );
|
||||
# OK with warning
|
||||
return ( 1, 'a should be lower than b' ) if( $self->a > $self->b );
|
||||
# NOK with an error
|
||||
return ( 0, 'a must not be equal to b !' );
|
||||
},
|
||||
sub {
|
||||
my($self)=@_;
|
||||
# ...
|
||||
return 1;
|
||||
},
|
||||
];
|
||||
|
||||
=head1 METHODS
|
||||
|
||||
=head2 new()
|
||||
|
||||
Constructor
|
||||
|
||||
=cut
|
||||
|
||||
package Devscripts::Config;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use File::HomeDir;
|
||||
use Getopt::Long qw(:config bundling permute no_getopt_compat);
|
||||
use Moo;
|
||||
|
||||
# Common options
|
||||
has common_opts => (
|
||||
is => 'ro',
|
||||
default => sub {
|
||||
[[
|
||||
'help', undef,
|
||||
sub {
|
||||
if ($_[1]) { $_[0]->usage; exit 0 }
|
||||
}
|
||||
]]
|
||||
});
|
||||
|
||||
# Internal attributes
|
||||
|
||||
has modified_conf_msg => (is => 'rw', default => sub { '' });
|
||||
|
||||
$ENV{HOME} = File::HomeDir->my_home;
|
||||
|
||||
our @config_files
|
||||
= ('/etc/devscripts.conf', ($ENV{HOME} ? "$ENV{HOME}/.devscripts" : ()));
|
||||
|
||||
sub keys {
|
||||
die "conffile_keys() must be defined in sub classes";
|
||||
}
|
||||
|
||||
=head2 parse()
|
||||
|
||||
Launches B<parse_conf_files()>, B<parse_command_line()> and B<check_rules>
|
||||
|
||||
=cut
|
||||
|
||||
sub BUILD {
|
||||
my ($self) = @_;
|
||||
$self->set_default;
|
||||
}
|
||||
|
||||
sub parse {
|
||||
my ($self) = @_;
|
||||
|
||||
# 1 - Parse /etc/devscripts.conf and ~/.devscripts
|
||||
$self->parse_conf_files;
|
||||
|
||||
# 2 - Parse command line
|
||||
$self->parse_command_line;
|
||||
|
||||
# 3 - Check rules
|
||||
$self->check_rules;
|
||||
return $self;
|
||||
}
|
||||
|
||||
# I - Parse /etc/devscripts.conf and ~/.devscripts
|
||||
|
||||
=head2 parse_conf_files()
|
||||
|
||||
Reads values in B</etc/devscripts.conf> and B<~/.devscripts>
|
||||
|
||||
=cut
|
||||
|
||||
sub set_default {
|
||||
my ($self) = @_;
|
||||
my $keys = $self->keys;
|
||||
foreach my $key (@$keys) {
|
||||
my ($kname, $name, $check, $default) = @$key;
|
||||
next unless (defined $default);
|
||||
$kname =~ s/^\-\-//;
|
||||
$kname =~ s/-/_/g;
|
||||
$kname =~ s/[!\|=].*$//;
|
||||
if (ref $default) {
|
||||
unless (ref $default eq 'CODE') {
|
||||
die "Default value must be a sub ($kname)";
|
||||
}
|
||||
$self->{$kname} = $default->();
|
||||
} else {
|
||||
$self->{$kname} = $default;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
sub parse_conf_files {
|
||||
my ($self) = @_;
|
||||
|
||||
my @cfg_files = @config_files;
|
||||
if (@ARGV) {
|
||||
if ($ARGV[0] =~ /^--no-?conf$/) {
|
||||
$self->modified_conf_msg(" (no configuration files read)");
|
||||
shift @ARGV;
|
||||
return $self;
|
||||
}
|
||||
my @tmp;
|
||||
while ($ARGV[0] and $ARGV[0] =~ s/^--conf-?file(?:=(.+))?//) {
|
||||
shift @ARGV;
|
||||
my $file = $1 || shift(@ARGV);
|
||||
if ($file) {
|
||||
unless ($file =~ s/^\+//) {
|
||||
@cfg_files = ();
|
||||
}
|
||||
push @tmp, $file;
|
||||
} else {
|
||||
return ds_die
|
||||
"Unable to parse --conf-file option, aborting parsing";
|
||||
}
|
||||
}
|
||||
push @cfg_files, @tmp;
|
||||
}
|
||||
|
||||
@cfg_files = grep { -r $_ } @cfg_files;
|
||||
my $keys = $self->keys;
|
||||
if (@cfg_files) {
|
||||
my @key_names = map { $_->[1] ? $_->[1] : () } @$keys;
|
||||
my %config_vars;
|
||||
|
||||
my $shell_cmd = q{for file ; do . "$file"; done ;};
|
||||
|
||||
# Read back values
|
||||
$shell_cmd .= q{ printf '%s\0' };
|
||||
my @shell_key_names = map { qq{"\$$_"} } @key_names;
|
||||
$shell_cmd .= join(' ', @shell_key_names);
|
||||
my $shell_out;
|
||||
spawn(
|
||||
exec => [
|
||||
'/bin/bash', '-c',
|
||||
$shell_cmd, 'devscripts-config-loader',
|
||||
@cfg_files
|
||||
],
|
||||
wait_child => 1,
|
||||
to_string => \$shell_out
|
||||
);
|
||||
@config_vars{@key_names} = map { s/^\s*(.*?)\s*/$1/ ? $_ : undef }
|
||||
split(/\0/, $shell_out, -1);
|
||||
|
||||
# Check validity and set value
|
||||
foreach my $key (@$keys) {
|
||||
my ($kname, $name, $check, $default) = @$key;
|
||||
next unless ($name);
|
||||
$kname //= '';
|
||||
$kname =~ s/^\-\-//;
|
||||
$kname =~ s/-/_/g;
|
||||
$kname =~ s/[!|=+].*$//;
|
||||
# Case 1: nothing in conf files, set default
|
||||
next unless (length $config_vars{$name});
|
||||
if (defined $check) {
|
||||
if (not(ref $check)) {
|
||||
$check
|
||||
= $self->_subs_check($check, $kname, $name, $default);
|
||||
}
|
||||
if (ref $check eq 'CODE') {
|
||||
my ($res, $msg)
|
||||
= $check->($self, $config_vars{$name}, $kname);
|
||||
ds_warn $msg unless ($res);
|
||||
next;
|
||||
} elsif (ref $check eq 'Regexp') {
|
||||
unless ($config_vars{$name} =~ $check) {
|
||||
ds_warn "Bad $name value $config_vars{$name}";
|
||||
next;
|
||||
}
|
||||
} else {
|
||||
ds_die "Unknown check type for $name";
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
$self->{$kname} = $config_vars{$name};
|
||||
$self->{modified_conf_msg} .= " $name=$config_vars{$name}\n";
|
||||
if (ref $default) {
|
||||
my $ref = ref $default->();
|
||||
my @tmp = ($config_vars{$name} =~ /\s+"([^"]*)"(?>\s+)/g);
|
||||
$config_vars{$name} =~ s/\s+"([^"]*)"\s+/ /g;
|
||||
push @tmp, split(/\s+/, $config_vars{$name});
|
||||
if ($ref eq 'ARRAY') {
|
||||
$self->{$kname} = \@tmp;
|
||||
} elsif ($ref eq 'HASH') {
|
||||
$self->{$kname}
|
||||
= { map { /^(.*?)=(.*)$/ ? ($1 => $2) : ($_ => 1) }
|
||||
@tmp };
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return $self;
|
||||
}
|
||||
|
||||
# II - Parse command line
|
||||
|
||||
=head2 parse_command_line()
|
||||
|
||||
Parse command line arguments
|
||||
|
||||
=cut
|
||||
|
||||
sub parse_command_line {
|
||||
my ($self, @arrays) = @_;
|
||||
my $opts = {};
|
||||
my $keys = [@{ $self->common_opts }, @{ $self->keys }];
|
||||
# If default value is set to [], we must prepare hash ref to be able to
|
||||
# receive more than one value
|
||||
foreach (@$keys) {
|
||||
if ($_->[3] and ref($_->[3])) {
|
||||
my $kname = $_->[0];
|
||||
$kname =~ s/[!\|=].*$//;
|
||||
$opts->{$kname} = $_->[3]->();
|
||||
}
|
||||
}
|
||||
unless (GetOptions($opts, map { $_->[0] ? ($_->[0]) : () } @$keys)) {
|
||||
$_[0]->usage;
|
||||
exit 1;
|
||||
}
|
||||
foreach my $key (@$keys) {
|
||||
my ($kname, $tmp, $check, $default) = @$key;
|
||||
next unless ($kname);
|
||||
$kname =~ s/[!|=+].*$//;
|
||||
my $name = $kname;
|
||||
$kname =~ s/-/_/g;
|
||||
if (defined $opts->{$name}) {
|
||||
next if (ref $opts->{$name} eq 'ARRAY' and !@{ $opts->{$name} });
|
||||
next if (ref $opts->{$name} eq 'HASH' and !%{ $opts->{$name} });
|
||||
if (defined $check) {
|
||||
if (not(ref $check)) {
|
||||
$check
|
||||
= $self->_subs_check($check, $kname, $name, $default);
|
||||
}
|
||||
if (ref $check eq 'CODE') {
|
||||
my ($res, $msg) = $check->($self, $opts->{$name}, $kname);
|
||||
ds_die "Bad value for $name: $msg" unless ($res);
|
||||
} elsif (ref $check eq 'Regexp') {
|
||||
if ($opts->{$name} =~ $check) {
|
||||
$self->{$kname} = $opts->{$name};
|
||||
} else {
|
||||
ds_die "Bad $name value in command line";
|
||||
}
|
||||
} else {
|
||||
ds_die "Unknown check type for $name";
|
||||
}
|
||||
} else {
|
||||
$self->{$kname} = $opts->{$name};
|
||||
}
|
||||
}
|
||||
}
|
||||
return $self;
|
||||
}
|
||||
|
||||
sub check_rules {
|
||||
my ($self) = @_;
|
||||
if ($self->can('rules')) {
|
||||
if (my $rules = $self->rules) {
|
||||
my $i = 0;
|
||||
foreach my $sub (@$rules) {
|
||||
$i++;
|
||||
my ($res, $msg) = $sub->($self);
|
||||
if ($res) {
|
||||
ds_warn($msg) if ($msg);
|
||||
} else {
|
||||
ds_error($msg || "config rule $i");
|
||||
# ds_error may not die if $Devscripts::Output::die_on_error
|
||||
# is set to 0
|
||||
next;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return $self;
|
||||
}
|
||||
|
||||
sub _subs_check {
|
||||
my ($self, $check, $kname, $name, $default) = @_;
|
||||
if ($check eq 'bool') {
|
||||
$check = sub {
|
||||
$_[0]->{$kname} = (
|
||||
$_[1] =~ /^(?:1|yes)$/i ? 1
|
||||
: $_[1] =~ /^(?:0|no)$/i ? 0
|
||||
: $default ? $default
|
||||
: undef
|
||||
);
|
||||
return 1;
|
||||
};
|
||||
} else {
|
||||
$self->die("Unknown check type for $name");
|
||||
}
|
||||
return $check;
|
||||
}
|
||||
|
||||
# Default usage: switch to manpage
|
||||
sub usage {
|
||||
$progname =~ s/\.pl//;
|
||||
exec("man", '-P', '/bin/cat', $progname);
|
||||
}
|
||||
|
||||
1;
|
||||
__END__
|
||||
=head1 SEE ALSO
|
||||
|
||||
L<devscripts>
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
Xavier Guimard E<lt>yadd@debian.orgE<gt>
|
||||
|
||||
=head1 COPYRIGHT AND LICENSE
|
||||
|
||||
Copyright 2018 by Xavier Guimard <yadd@debian.org>
|
||||
|
||||
This program is free software; you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation; either version 2 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
=cut
|
364
lib/Devscripts/DB_File_Lock.pm
Normal file
@@ -0,0 +1,364 @@
#
|
||||
# DB_File::Lock
|
||||
#
|
||||
# by David Harris <dharris@drh.net>
|
||||
#
|
||||
# Copyright (c) 1999-2000 David R. Harris. All rights reserved.
|
||||
# This program is free software; you can redistribute it and/or modify it
|
||||
# under the same terms as Perl itself.
|
||||
#
|
||||
|
||||
# We rename the package so that we don't have to package it separately.
|
||||
# package DB_File::Lock;
|
||||
package Devscripts::DB_File_Lock;
|
||||
|
||||
require 5.004;
|
||||
|
||||
use strict;
|
||||
use vars qw($VERSION @ISA $locks);
|
||||
|
||||
@ISA = qw(DB_File);
|
||||
$VERSION = '0.05';
|
||||
|
||||
use DB_File ();
|
||||
use Fcntl qw(:flock O_RDWR O_RDONLY O_WRONLY O_CREAT);
|
||||
use Carp qw(croak carp);
|
||||
use Symbol ();
|
||||
|
||||
# import function can't be inherited, so this magic required
|
||||
sub import {
|
||||
my $ourname = shift;
|
||||
my @imports
|
||||
= @_; # dynamic scoped var, still in scope after package call in eval
|
||||
my $module = caller;
|
||||
my $calling = $ISA[0];
|
||||
eval " package $module; import $calling, \@imports; ";
|
||||
}
|
||||
|
||||
sub _lock_and_tie {
|
||||
my $package = shift;
|
||||
|
||||
## Grab the type of tie
|
||||
|
||||
my $tie_type = pop @_;
|
||||
|
||||
## There are two ways of passing data defined by DB_File
|
||||
|
||||
my $lock_data;
|
||||
my @dbfile_data;
|
||||
|
||||
if (@_ == 5) {
|
||||
$lock_data = pop @_;
|
||||
@dbfile_data = @_;
|
||||
} elsif (@_ == 2) {
|
||||
$lock_data = pop @_;
|
||||
@dbfile_data = @{ $_[0] };
|
||||
} else {
|
||||
croak "invalid number of arguments";
|
||||
}
|
||||
|
||||
## Decipher the lock_data
|
||||
|
||||
my $mode;
|
||||
my $nonblocking = 0;
|
||||
my $lockfile_name = $dbfile_data[0] . ".lock";
|
||||
my $lockfile_mode;
|
||||
|
||||
if (lc($lock_data) eq "read") {
|
||||
$mode = "read";
|
||||
} elsif (lc($lock_data) eq "write") {
|
||||
$mode = "write";
|
||||
} elsif (ref($lock_data) eq "HASH") {
|
||||
$mode = lc $lock_data->{mode};
|
||||
croak "invalid mode ($mode)" if ($mode ne "read" and $mode ne "write");
|
||||
$nonblocking = $lock_data->{nonblocking};
|
||||
$lockfile_name = $lock_data->{lockfile_name}
|
||||
if (defined $lock_data->{lockfile_name});
|
||||
$lockfile_mode = $lock_data->{lockfile_mode};
|
||||
} else {
|
||||
croak "invalid lock_data ($lock_data)";
|
||||
}
|
||||
|
||||
## Warn about opening a lockfile for writing when only locking for reading
|
||||
|
||||
# NOTE: This warning disabled for RECNO because RECNO seems to require O_RDWR
|
||||
# even when opening only for reading.
|
||||
|
||||
carp
|
||||
"opening with write access when locking only for reading (use O_RDONLY to fix)"
|
||||
if ((
|
||||
$dbfile_data[1] && O_RDWR
|
||||
or $dbfile_data[1] && O_WRONLY
|
||||
) # any kind of write access
|
||||
and $mode eq "read" # and opening for reading
|
||||
and $tie_type ne "TIEARRAY" # and not RECNO
|
||||
);
|
||||
|
||||
## Determine the mode of the lockfile, if not given
|
||||
|
||||
# THEORY: if someone can read or write the database file, we must allow
|
||||
# them to read and write the lockfile.
|
||||
|
||||
if (not defined $lockfile_mode) {
|
||||
$lockfile_mode = 0600; # we must be allowed to read/write lockfile
|
||||
$lockfile_mode |= 0060 if ($dbfile_data[2] & 0060);
|
||||
$lockfile_mode |= 0006 if ($dbfile_data[2] & 0006);
|
||||
}
|
||||
|
||||
## Open the lockfile, lock it, and open the database
|
||||
|
||||
my $lockfile_fh = Symbol::gensym();
|
||||
my $saved_umask = umask(0000) if (umask() & $lockfile_mode);
|
||||
my $open_ok = sysopen($lockfile_fh, $lockfile_name, O_RDWR | O_CREAT,
|
||||
$lockfile_mode);
|
||||
umask($saved_umask) if (defined $saved_umask);
|
||||
$open_ok or croak "could not open lockfile ($lockfile_name)";
|
||||
|
||||
my $flock_flags
|
||||
= ($mode eq "write" ? LOCK_EX : LOCK_SH) | ($nonblocking ? LOCK_NB : 0);
|
||||
if (not flock $lockfile_fh, $flock_flags) {
|
||||
close $lockfile_fh;
|
||||
return undef if ($nonblocking);
|
||||
croak "could not flock lockfile";
|
||||
}
|
||||
|
||||
my $self
|
||||
= $tie_type eq "TIEHASH"
|
||||
? $package->SUPER::TIEHASH(@_)
|
||||
: $package->SUPER::TIEARRAY(@_);
|
||||
if (not $self) {
|
||||
close $lockfile_fh;
|
||||
return $self;
|
||||
}
|
||||
|
||||
## Store the info for the DESTROY function
|
||||
|
||||
my $id = "" . $self;
|
||||
$id =~ s/^[^=]+=//; # remove the package name in case re-blessing occurs
|
||||
$locks->{$id} = $lockfile_fh;
|
||||
|
||||
## Return the object
|
||||
|
||||
return $self;
|
||||
}
|
||||
|
||||
sub TIEHASH {
|
||||
return _lock_and_tie(@_, 'TIEHASH');
|
||||
}
|
||||
|
||||
sub TIEARRAY {
|
||||
return _lock_and_tie(@_, 'TIEARRAY');
|
||||
}
|
||||
|
||||
sub DESTROY {
|
||||
my $self = shift;
|
||||
|
||||
my $id = "" . $self;
|
||||
$id =~ s/^[^=]+=//;
|
||||
my $lockfile_fh = $locks->{$id};
|
||||
delete $locks->{$id};
|
||||
|
||||
$self->SUPER::DESTROY(@_);
|
||||
|
||||
# un-flock not needed, as we close here
|
||||
close $lockfile_fh;
|
||||
}
|
||||
|
||||
1;
|
||||
__END__
|
||||
|
||||
=head1 NAME
|
||||
|
||||
DB_File::Lock - Locking with flock wrapper for DB_File
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use DB_File::Lock;
|
||||
use Fcntl qw(:flock O_RDWR O_CREAT);
|
||||
|
||||
$locking = "read";
|
||||
$locking = "write";
|
||||
$locking = {
|
||||
mode => "read",
|
||||
nonblocking => 0,
|
||||
lockfile_name => "/path/to/shared.lock",
|
||||
lockfile_mode => 0600,
|
||||
};
|
||||
|
||||
[$X =] tie %hash, 'DB_File::Lock', $filename, $flags, $mode, $DB_HASH, $locking;
|
||||
[$X =] tie %hash, 'DB_File::Lock', $filename, $flags, $mode, $DB_BTREE, $locking;
|
||||
[$X =] tie @array, 'DB_File::Lock', $filename, $flags, $mode, $DB_RECNO, $locking;
|
||||
|
||||
# or place the DB_File arguments inside a list reference:
|
||||
[$X =] tie %hash, 'DB_File::Lock', [$filename, $flags, $mode, $DB_HASH], $locking;
|
||||
|
||||
...use the same way as DB_File for the rest of the interface...
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
This module provides a wrapper for the DB_File module, adding locking.
|
||||
|
||||
When you need locking, simply use this module in place of DB_File and
|
||||
add an extra argument onto the tie command specifying if the file should
|
||||
be locked for reading or writing.
|
||||
|
||||
The alternative is to write code like:
|
||||
|
||||
open(LOCK, "<$db_filename.lock") or die;
|
||||
flock(LOCK, LOCK_SH) or die;
|
||||
tie(%db_hash, 'DB_File', $db_filename, O_RDONLY, 0600, $DB_HASH) or die;
|
||||
... then read the database ...
|
||||
untie(%db_hash);
|
||||
close(LOCK);
|
||||
|
||||
This module lets you write
|
||||
|
||||
tie(%db_hash, 'DB_File::Lock', $db_filename, O_RDONLY, 0600, $DB_HASH, 'read') or die;
|
||||
... then read the database ...
|
||||
untie(%db_hash);
|
||||
|
||||
This is better for two reasons:
|
||||
|
||||
(1) Less cumbersome to write.
|
||||
|
||||
(2) A fatal exception in the code working on the database which does
not lead to process termination will, with the hand-written version,
probably skip the close of the lockfile and therefore leave a stale
lock behind.
|
||||
|
||||
=head1 USAGE DETAILS
|
||||
|
||||
Tie to the database file by adding an additional locking argument
|
||||
to the list of arguments to be passed through to DB_File, such as:
|
||||
|
||||
tie(%db_hash, 'DB_File::Lock', $db_filename, O_RDONLY, 0600, $DB_HASH, 'read');
|
||||
|
||||
or enclose the arguments for DB_File in a list reference:
|
||||
|
||||
tie(%db_hash, 'DB_File::Lock', [$db_filename, O_RDONLY, 0600, $DB_HASH], 'read');
|
||||
|
||||
The filename used for the lockfile defaults to "$filename.lock"
|
||||
(the filename of the DB_File with ".lock" appended). Using a lockfile
|
||||
separate from the database file is recommended because it prevents weird
|
||||
interactions with the underlying database file library.
|
||||
|
||||
The additional locking argument added to the tie call can be:
|
||||
|
||||
(1) "read" -- acquires a shared lock for reading
|
||||
|
||||
(2) "write" -- acquires an exclusive lock for writing
|
||||
|
||||
(3) A hash with the following keys (all optional except for the "mode"):
|
||||
|
||||
=over 4
|
||||
|
||||
=item mode
|
||||
|
||||
the locking mode, "read" or "write".
|
||||
|
||||
=item lockfile_name
|
||||
|
||||
specifies the name of the lockfile to use. Default
|
||||
is "$filename.lock". This is useful for locking multiple resources with
|
||||
the same lockfile.
|
||||
|
||||
=item nonblocking
|
||||
|
||||
determines if the flock call on the lockfile should
|
||||
block waiting for a lock, or if it should return failure if a lock can
|
||||
not be immediately attained. If "nonblocking" is set and a lock can not
|
||||
be attained, the tie command will fail. Currently, I'm not sure how to
|
||||
differentiate this from a failure in the DB_File layer.
|
||||
|
||||
=item lockfile_mode
|
||||
|
||||
determines the mode for the sysopen call that opens the lockfile. The
default mode is chosen so that anyone who can read or write the DB_File
is also able to read and write the lockfile. (This is because, as I
understand it, some systems may require write access to a file in order
to lock it, even for reading.) The umask will be prevented from applying
to this mode.
|
||||
|
||||
=back
|
||||
|
||||
Note: One may import the same values from DB_File::Lock as one may import
|
||||
from DB_File.
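Putting the pieces above together, here is a hedged sketch (the database
path, lockfile path, and option values are invented for illustration) of
opening a database read-only with a nonblocking shared lock on a lockfile
that is shared by several databases:

  use DB_File::Lock;
  use Fcntl qw(O_RDONLY);

  my $db_filename = "/var/tmp/app.db";          # hypothetical database
  my $locking     = {
      mode          => "read",                  # shared lock
      nonblocking   => 1,                       # fail rather than wait
      lockfile_name => "/var/tmp/app-db.lock",  # hypothetical shared lockfile
      lockfile_mode => 0600,
  };

  my %db_hash;
  tie(%db_hash, 'DB_File::Lock', $db_filename, O_RDONLY, 0600, $DB_HASH, $locking)
      or warn "could not obtain the lock right now";
  # ... read the database ...
  untie %db_hash;    # closes the database and releases the lock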
|
||||
|
||||
=head1 GOOD LOCKING ETIQUETTE
|
||||
|
||||
To avoid locking problems, realize that it is B<critical> that you release
|
||||
the lock as soon as possible. See the lock as a "hot potato", something
|
||||
that you must work with and get rid of as quickly as possible. See the
|
||||
sections of code where you have a lock as "critical" sections. Make sure
|
||||
that you call "untie" as soon as possible.
|
||||
|
||||
It is often better to rewrite code structured like this:
|
||||
|
||||
# open database file with lock
|
||||
# work with database
|
||||
# lots of processing not related to database
|
||||
# work with database
|
||||
# close database and release lock
|
||||
|
||||
as:
|
||||
|
||||
# open database file with lock
|
||||
# work with database
|
||||
# close database and release lock
|
||||
|
||||
# lots of processing not related to database
|
||||
|
||||
# open database file with lock
|
||||
# work with database
|
||||
# close database and release lock
|
||||
|
||||
Also realize that acquiring two locks at the same time can lead to a
deadlock.
|
||||
|
||||
You can enter a deadlock situation if two processes simultaneously try to
|
||||
acquire locks on two separate databases. Each has locked only one of
|
||||
the databases, and cannot continue without locking the second. Yet the
second will never become free, because it is held by the other process. If your
|
||||
processes all ask for their DB files in the same order, this situation
|
||||
cannot occur.
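For instance, a minimal sketch (the database names are invented, and
O_RDWR is assumed to be imported from Fcntl as in the SYNOPSIS) of two
databases always being locked in the same order:

  # Every process ties (and therefore locks) the databases in the same
  # alphabetical order, so no two processes can each hold one lock
  # while waiting for the other.
  tie(my %accounts, 'DB_File::Lock', "accounts.db", O_RDWR, 0600, $DB_HASH, "write") or die;
  tie(my %journal,  'DB_File::Lock', "journal.db",  O_RDWR, 0600, $DB_HASH, "write") or die;
  # ... update both databases ...
  untie %journal;
  untie %accounts;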
|
||||
|
||||
=head1 OTHER LOCKING MODULES
|
||||
|
||||
There are three locking wrappers for DB_File in CPAN right now. Each one
|
||||
implements locking differently and has different goals in mind. It is
|
||||
therefore worth knowing the difference, so that you can pick the right
|
||||
one for your application.
|
||||
|
||||
Here are the three locking wrappers:
|
||||
|
||||
Tie::DB_Lock -- DB_File wrapper which creates copies of the database file
|
||||
for read access, so that you have kind of a multiversioning concurrent
|
||||
read system. However, updates are still serial. Use for databases where
|
||||
reads may be lengthy and consistency problems may occur.
|
||||
|
||||
Tie::DB_LockFile -- DB_File wrapper that has the ability to lock and
|
||||
unlock the database while it is being used. Avoids the tie-before-flock
|
||||
problem by simply re-tie-ing the database when you get or drop a
|
||||
lock. Because of the flexibility in dropping and re-acquiring the lock
|
||||
in the middle of a session, this can be massaged into a system that will
|
||||
work with long updates and/or reads if the application follows the hints
|
||||
in the POD documentation.
|
||||
|
||||
DB_File::Lock (this module) -- extremely lightweight DB_File wrapper
|
||||
that simply flocks a lockfile before tie-ing the database and drops the
|
||||
lock after the untie. Allows one to use the same lockfile for multiple
|
||||
databases to avoid deadlock problems, if desired. Use for databases where
|
||||
updates and reads are quick and simple flock locking semantics are enough.
|
||||
|
||||
(This text is duplicated in the POD documentation, by the way.)
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
David Harris <dharris@drh.net>
|
||||
|
||||
Helpful insight from Stas Bekman <stas@stason.org>
|
||||
|
||||
=head1 SEE ALSO
|
||||
|
||||
DB_File(3).
|
||||
|
||||
=cut
|
481
lib/Devscripts/Debbugs.pm
Normal file
|
@ -0,0 +1,481 @@
|
|||
# This is Debbugs.pm from the Debian devscripts package
|
||||
#
|
||||
# Copyright (C) 2008 Adam D. Barratt
|
||||
# select() is Copyright (C) 2007 Don Armstrong
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
package Devscripts::Debbugs;
|
||||
|
||||
=head1 OPTIONS
|
||||
|
||||
=over
|
||||
|
||||
=item select [key:value ...]
|
||||
|
||||
Uses the SOAP interface to output a list of bugs which match the given
|
||||
selection requirements.
|
||||
|
||||
The following keys are allowed, and may be given multiple times.
|
||||
|
||||
=over 8
|
||||
|
||||
=item package
|
||||
|
||||
Binary package name.
|
||||
|
||||
=item source
|
||||
|
||||
Source package name.
|
||||
|
||||
=item maintainer
|
||||
|
||||
E-mail address of the maintainer.
|
||||
|
||||
=item submitter
|
||||
|
||||
E-mail address of the submitter.
|
||||
|
||||
=item severity
|
||||
|
||||
Bug severity.
|
||||
|
||||
=item status
|
||||
|
||||
Status of the bug.
|
||||
|
||||
=item tag
|
||||
|
||||
Tags applied to the bug. If I<users> is specified, may include
|
||||
usertags in addition to the standard tags.
|
||||
|
||||
=item owner
|
||||
|
||||
Bug's owner.
|
||||
|
||||
=item correspondent
|
||||
|
||||
Address of someone who sent mail to the log.
|
||||
|
||||
=item affects
|
||||
|
||||
Bugs which affect this package.
|
||||
|
||||
=item bugs
|
||||
|
||||
List of bugs to search within.
|
||||
|
||||
=item users
|
||||
|
||||
Users to use when looking up usertags.
|
||||
|
||||
=item archive
|
||||
|
||||
Whether to search archived bugs or normal bugs; defaults to 0
|
||||
(i.e. only search normal bugs). As a special case, if archive is
|
||||
'both', both archived and unarchived bugs are returned.
|
||||
|
||||
=back
|
||||
|
||||
For example, to select the set of bugs submitted by
|
||||
jrandomdeveloper@example.com and tagged wontfix, one would use
|
||||
|
||||
select("submitter:jrandomdeveloper@example.com", "tag:wontfix")
|
||||
|
||||
=back
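As a rough end-to-end sketch (assuming the libsoap-lite-perl package is
installed and the Debian BTS is reachable; the BTS get_bugs call returns
bug numbers, so select() yields a reference to a list of them):

  use Devscripts::Debbugs;

  die "SOAP support is not available\n"
      unless Devscripts::Debbugs::have_soap();

  my $bugs = Devscripts::Debbugs::select(
      "package:devscripts", "severity:wishlist");
  print "$_\n" for @$bugs;

  # Full status records can then be fetched (in batches of 500):
  my $status = Devscripts::Debbugs::status(@$bugs);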
|
||||
|
||||
=cut
|
||||
|
||||
use strict;
|
||||
use warnings;
|
||||
|
||||
my $soapurl = 'Debbugs/SOAP/1';
|
||||
our $btsurl = 'http://bugs.debian.org/';
|
||||
my @errors;
|
||||
|
||||
our $soap_timeout;
|
||||
|
||||
sub soap_timeout {
|
||||
my $timeout_arg = shift;
|
||||
if (defined $timeout_arg and $timeout_arg =~ m{^[1-9]\d*$}) {
|
||||
$soap_timeout = $timeout_arg;
|
||||
}
|
||||
}
|
||||
|
||||
sub init_soap {
|
||||
my $soapproxyurl;
|
||||
if ($btsurl =~ m%^https?://(.*)/?$%) {
|
||||
$soapproxyurl = $btsurl . '/';
|
||||
} else {
|
||||
$soapproxyurl = 'http://' . $btsurl . '/';
|
||||
}
|
||||
$soapproxyurl =~ s%//$%/%;
|
||||
$soapproxyurl .= 'cgi-bin/soap.cgi';
|
||||
my %options;
|
||||
if ($soap_timeout) {
|
||||
$options{timeout} = $soap_timeout;
|
||||
}
|
||||
my $soap = SOAP::Lite->uri($soapurl)->proxy($soapproxyurl, %options);
|
||||
|
||||
$soap->transport->env_proxy();
|
||||
$soap->on_fault(\&getSOAPError);
|
||||
|
||||
return $soap;
|
||||
}
|
||||
|
||||
my $soap_broken;
|
||||
|
||||
sub have_soap {
|
||||
return ($soap_broken ? 0 : 1) if defined $soap_broken;
|
||||
eval { require SOAP::Lite; };
|
||||
|
||||
if ($@) {
|
||||
if ($@ =~ m%^Can't locate SOAP/%) {
|
||||
$soap_broken = "the libsoap-lite-perl package is not installed";
|
||||
} else {
|
||||
$soap_broken = "couldn't load SOAP::Lite: $@";
|
||||
}
|
||||
} else {
|
||||
$soap_broken = 0;
|
||||
}
|
||||
return ($soap_broken ? 0 : 1);
|
||||
}
|
||||
|
||||
sub getSOAPError {
|
||||
my ($soap, $result) = @_;
|
||||
my $err;
|
||||
if (ref($result)) {
|
||||
$err = $result->faultstring;
|
||||
} else {
|
||||
$err = $soap->transport->status;
|
||||
}
|
||||
chomp $err;
|
||||
push @errors, $err;
|
||||
|
||||
return new SOAP::SOM;
|
||||
}
|
||||
|
||||
sub usertags {
|
||||
die "Couldn't run usertags: $soap_broken\n" unless have_soap();
|
||||
|
||||
my @args = @_;
|
||||
|
||||
my $soap = init_soap();
|
||||
my $usertags = $soap->get_usertag(@_);
|
||||
|
||||
if (@errors or not defined $usertags) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error retrieving usertags from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $result = $usertags->result();
|
||||
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error retrieving usertags from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
sub select {
|
||||
die "Couldn't run select: $soap_broken\n" unless have_soap();
|
||||
my @args = @_;
|
||||
my %valid_keys = (
|
||||
package => 'package',
|
||||
pkg => 'package',
|
||||
src => 'src',
|
||||
source => 'src',
|
||||
maint => 'maint',
|
||||
maintainer => 'maint',
|
||||
submitter => 'submitter',
|
||||
from => 'submitter',
|
||||
status => 'status',
|
||||
tag => 'tag',
|
||||
tags => 'tag',
|
||||
usertag => 'tag',
|
||||
usertags => 'tag',
|
||||
owner => 'owner',
|
||||
dist => 'dist',
|
||||
distribution => 'dist',
|
||||
bugs => 'bugs',
|
||||
archive => 'archive',
|
||||
severity => 'severity',
|
||||
correspondent => 'correspondent',
|
||||
affects => 'affects',
|
||||
);
|
||||
my %users;
|
||||
my %search_parameters;
|
||||
my $soap = init_soap();
|
||||
for my $arg (@args) {
|
||||
my ($key, $value) = split /:/, $arg, 2;
|
||||
next unless $key;
|
||||
if (exists $valid_keys{$key}) {
|
||||
if ($valid_keys{$key} eq 'archive') {
|
||||
$search_parameters{ $valid_keys{$key} } = $value
|
||||
if $value;
|
||||
} else {
|
||||
push @{ $search_parameters{ $valid_keys{$key} } }, $value
|
||||
if $value;
|
||||
}
|
||||
} elsif ($key =~ /users?$/) {
|
||||
$users{$value} = 1 if $value;
|
||||
} else {
|
||||
warn "select(): Unrecognised key: $key\n";
|
||||
}
|
||||
}
|
||||
my %usertags;
|
||||
for my $user (keys %users) {
|
||||
my $ut = usertags($user);
|
||||
next unless defined $ut and $ut ne "";
|
||||
for my $tag (keys %{$ut}) {
|
||||
push @{ $usertags{$tag} }, @{ $ut->{$tag} };
|
||||
}
|
||||
}
|
||||
my $bugs = $soap->get_bugs(%search_parameters,
|
||||
(keys %usertags) ? (usertags => \%usertags) : ());
|
||||
|
||||
if (@errors or not defined $bugs) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieving bugs from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $result = $bugs->result();
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieving bugs from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
sub status {
|
||||
die "Couldn't run status: $soap_broken\n" unless have_soap();
|
||||
my @args = @_;
|
||||
|
||||
my $soap = init_soap();
|
||||
|
||||
my $result = {};
|
||||
while (my @slice = splice(@args, 0, 500)) {
|
||||
my $bugs = $soap->get_status(@slice);
|
||||
|
||||
if (@errors or not defined $bugs) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving bug statuses from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $tmp = $bugs->result();
|
||||
|
||||
if (@errors or not defined $tmp) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving bug statuses from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
%$result = (%$result, %$tmp);
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
sub versions {
|
||||
die "Couldn't run versions: $soap_broken\n" unless have_soap();
|
||||
|
||||
my @args = @_;
|
||||
my %valid_keys = (
|
||||
package => 'package',
|
||||
pkg => 'package',
|
||||
src => 'source',
|
||||
source => 'source',
|
||||
time => 'time',
|
||||
binary => 'no_source_arch',
|
||||
notsource => 'no_source_arch',
|
||||
archs => 'return_archs',
|
||||
displayarch => 'return_archs',
|
||||
);
|
||||
|
||||
my %search_parameters;
|
||||
my @archs = ();
|
||||
my @dists = ();
|
||||
|
||||
for my $arg (@args) {
|
||||
my ($key, $value) = split /:/, $arg, 2;
|
||||
$value ||= "1";
|
||||
if ($key =~ /^arch(itecture)?$/) {
|
||||
push @archs, $value;
|
||||
} elsif ($key =~ /^dist(ribution)?$/) {
|
||||
push @dists, $value;
|
||||
} elsif (exists $valid_keys{$key}) {
|
||||
$search_parameters{ $valid_keys{$key} } = $value;
|
||||
}
|
||||
}
|
||||
|
||||
$search_parameters{arch} = \@archs if @archs;
|
||||
$search_parameters{dist} = \@dists if @dists;
|
||||
|
||||
my $soap = init_soap();
|
||||
|
||||
my $versions = $soap->get_versions(%search_parameters);
|
||||
|
||||
if (@errors or not defined $versions) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving package versions from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $result = $versions->result();
|
||||
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieivng package versions from SOAP server: $error";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
sub versions_with_arch {
|
||||
die "Couldn't run versions_with_arch: $soap_broken\n" unless have_soap();
|
||||
my @args = @_;
|
||||
|
||||
my $versions = versions(@args, 'displayarch:1');
|
||||
|
||||
if (not defined $versions) {
|
||||
die "Error while retrieivng package versions from SOAP server: $@";
|
||||
}
|
||||
|
||||
return $versions;
|
||||
}
|
||||
|
||||
sub newest_bugs {
|
||||
die "Couldn't run newest_bugs: $soap_broken\n" unless have_soap();
|
||||
my $count = shift || '';
|
||||
|
||||
return if $count !~ /^\d+$/;
|
||||
|
||||
my $soap = init_soap();
|
||||
|
||||
my $bugs = $soap->newest_bugs($count);
|
||||
|
||||
if (@errors or not defined $bugs) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieving newest bug list from SOAP server: $error";
|
||||
}
|
||||
|
||||
my $result = $bugs->result();
|
||||
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieving newest bug list from SOAP server: $error";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
# debbugs currently ignores the $msg_num parameter
|
||||
# but eventually it might not, so we support passing it
|
||||
|
||||
sub bug_log {
|
||||
die "Couldn't run bug_log: $soap_broken\n" unless have_soap();
|
||||
|
||||
my $bug = shift || '';
|
||||
my $message = shift;
|
||||
|
||||
return if $bug !~ /^\d+$/;
|
||||
|
||||
my $soap = init_soap();
|
||||
|
||||
my $log = $soap->get_bug_log($bug, $message);
|
||||
|
||||
if (@errors or not defined $log) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieving bug log from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $result = $log->result();
|
||||
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die "Error while retrieving bug log from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
sub binary_to_source {
|
||||
die "Couldn't run binary_to_source: $soap_broken\n"
|
||||
unless have_soap();
|
||||
|
||||
my $soap = init_soap();
|
||||
|
||||
my $binpkg = shift;
|
||||
my $binver = shift;
|
||||
my $arch = shift;
|
||||
|
||||
return if not defined $binpkg or not defined $binver;
|
||||
|
||||
my $mapping = $soap->binary_to_source($binpkg, $binver, $arch);
|
||||
|
||||
if (@errors or not defined $mapping) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving binary to source mapping from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $result = $mapping->result();
|
||||
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving binary to source mapping from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
sub source_to_binary {
|
||||
die "Couldn't run source_to_binary: $soap_broken\n"
|
||||
unless have_soap();
|
||||
|
||||
my $soap = init_soap();
|
||||
|
||||
my $srcpkg = shift;
|
||||
my $srcver = shift;
|
||||
|
||||
return if not defined $srcpkg or not defined $srcver;
|
||||
|
||||
my $mapping = $soap->source_to_binary($srcpkg, $srcver);
|
||||
|
||||
if (@errors or not defined $mapping) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving source to binary mapping from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
my $result = $mapping->result();
|
||||
|
||||
if (@errors or not defined $result) {
|
||||
my $error = join("\n", @errors);
|
||||
die
|
||||
"Error while retrieving source to binary mapping from SOAP server: $error\n";
|
||||
}
|
||||
|
||||
return $result;
|
||||
}
|
||||
|
||||
1;
|
||||
|
||||
__END__
|
||||
|
97
lib/Devscripts/JSONCache.pm
Normal file
|
@ -0,0 +1,97 @@
|
|||
package Devscripts::JSONCache;
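# Devscripts::JSONCache provides a tied-hash interface to a small JSON
# file: the file is read lazily on first access and written back (with
# a restrictive 0600 mode) when the hash is untied or destroyed.
#
# A minimal usage sketch (the cache path is just an example):
#
#   tie my %cache, 'Devscripts::JSONCache', '/tmp/devscripts-cache.json';
#   $cache{last_run} = time;
#   untie %cache;    # triggers save()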
|
||||
|
||||
use strict;
|
||||
use JSON;
|
||||
use Moo;
|
||||
|
||||
has file => (is => 'rw', required => 1);
|
||||
|
||||
has saved => (is => 'rw');
|
||||
|
||||
has _data => (is => 'rw');
|
||||
|
||||
sub save_sec {
|
||||
my ($self, $obj) = @_;
|
||||
my $tmp = umask;
|
||||
umask 0177;
|
||||
open(my $fh, '>', $self->file) or ($self->saved(1) and die $!);
|
||||
print $fh JSON::to_json($obj);
|
||||
close $fh;
|
||||
umask $tmp;
|
||||
}
|
||||
|
||||
sub data {
|
||||
my ($self) = @_;
|
||||
return $self->_data if $self->_data;
|
||||
my $res;
|
||||
if (-r $self->file) {
|
||||
open(F, $self->file) or ($self->saved(1) and die $!);
|
||||
$res = JSON::from_json(join('', <F>) || "{}");
|
||||
close F;
|
||||
} else {
|
||||
$self->save_sec({});
|
||||
$self->saved(0);
|
||||
}
|
||||
return $self->_data($res);
|
||||
}
|
||||
|
||||
sub TIEHASH {
|
||||
my $r = shift->new({
|
||||
file => shift,
|
||||
@_,
|
||||
});
|
||||
# build data
|
||||
$r->data;
|
||||
return $r;
|
||||
}
|
||||
|
||||
sub FETCH {
|
||||
return $_[0]->data->{ $_[1] };
|
||||
}
|
||||
|
||||
sub STORE {
|
||||
$_[0]->data->{ $_[1] } = $_[2];
|
||||
}
|
||||
|
||||
sub DELETE {
|
||||
delete $_[0]->data->{ $_[1] };
|
||||
}
|
||||
|
||||
sub CLEAR {
|
||||
$_[0]->save({});
|
||||
}
|
||||
|
||||
sub EXISTS {
|
||||
return exists $_[0]->data->{ $_[1] };
|
||||
}
|
||||
|
||||
sub FIRSTKEY {
|
||||
my ($k) = sort { $a cmp $b } keys %{ $_[0]->data };
|
||||
return $k;
|
||||
}
|
||||
|
||||
sub NEXTKEY {
|
||||
my ($self, $last) = @_;
|
||||
my $i = 0;
|
||||
my @keys = map {
|
||||
return $_ if ($i);
|
||||
$i++ if ($_ eq $last);
|
||||
return ()
|
||||
}
|
||||
sort { $a cmp $b } keys %{ $_[0]->data };
|
||||
return @keys ? $keys[0] : ();
|
||||
}
|
||||
|
||||
sub SCALAR {
|
||||
return scalar %{ $_[0]->data };
|
||||
}
|
||||
|
||||
sub save {
|
||||
return if ($_[0]->saved);
|
||||
eval { $_[0]->save_sec($_[0]->data); };
|
||||
$_[0]->saved(1);
|
||||
}
|
||||
|
||||
*DESTROY = *UNTIE = *save;
|
||||
|
||||
1;
|
628
lib/Devscripts/MkOrigtargz.pm
Normal file
|
@ -0,0 +1,628 @@
|
|||
package Devscripts::MkOrigtargz;
|
||||
|
||||
use strict;
|
||||
use Cwd 'abs_path';
|
||||
use Devscripts::Compression qw/
|
||||
compression_guess_from_file
|
||||
compression_get_file_extension
|
||||
compression_get_cmdline_compress
|
||||
compression_get_cmdline_decompress
|
||||
/;
|
||||
use Devscripts::MkOrigtargz::Config;
|
||||
use Devscripts::Output;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Utils;
|
||||
use Dpkg::Changelog::Debian;
|
||||
use Dpkg::Control::Hash;
|
||||
use Dpkg::IPC;
|
||||
use Dpkg::Version;
|
||||
use File::Copy;
|
||||
use File::Spec;
|
||||
use File::Temp qw/tempdir/;
|
||||
use Moo;
|
||||
|
||||
has config => (
|
||||
is => 'rw',
|
||||
default => sub {
|
||||
Devscripts::MkOrigtargz::Config->new->parse;
|
||||
},
|
||||
);
|
||||
|
||||
has exclude_globs => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->config->exclude_file },
|
||||
);
|
||||
|
||||
has include_globs => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->config->include_file },
|
||||
);
|
||||
|
||||
has status => (is => 'rw', default => sub { 0 });
|
||||
has destfile_nice => (is => 'rw');
|
||||
|
||||
our $found_comp;
|
||||
|
||||
sub do {
|
||||
my ($self) = @_;
|
||||
$self->parse_copyrights or $self->make_orig_targz;
|
||||
return $self->status;
|
||||
}
|
||||
|
||||
sub make_orig_targz {
|
||||
my ($self) = @_;
|
||||
|
||||
# Now we know what the final filename will be
|
||||
my $destfilebase = sprintf "%s_%s.%s.tar", $self->config->package,
|
||||
$self->config->version, $self->config->orig;
|
||||
my $destfiletar = sprintf "%s/%s", $self->config->directory, $destfilebase;
|
||||
my $destext
|
||||
= $self->config->compression eq 'default'
|
||||
? 'default'
|
||||
: compression_get_file_extension($self->config->compression);
|
||||
my $destfile;
|
||||
|
||||
# $upstream_tar is $upstream, unless the latter was a zip file.
|
||||
my $upstream_tar = $self->config->upstream;
|
||||
|
||||
# Remember this for the final report
|
||||
my $zipfile_deleted = 0;
|
||||
|
||||
# If the file is a zipfile, we need to create a tarfile from it.
|
||||
if ($self->config->upstream_type eq 'zip') {
|
||||
$destfile = $self->fix_dest_file($destfiletar);
|
||||
if ($self->config->signature) {
|
||||
$self->config->signature(4); # repack upstream file
|
||||
}
|
||||
|
||||
my $tempdir = tempdir("uscanXXXX", TMPDIR => 1, CLEANUP => 1);
|
||||
# Parent of the target directory should be under our control
|
||||
$tempdir .= '/repack';
|
||||
my @cmd;
|
||||
unless (mkdir $tempdir) {
|
||||
ds_die("Unable to mkdir($tempdir): $!\n");
|
||||
return $self->status(1);
|
||||
}
|
||||
@cmd = ('unzip', '-q');
|
||||
push @cmd, split ' ', $self->config->unzipopt
|
||||
if defined $self->config->unzipopt;
|
||||
push @cmd, ('-d', $tempdir, $upstream_tar);
|
||||
unless (ds_exec_no_fail(@cmd) >> 8 == 0) {
|
||||
ds_die(
|
||||
"Repacking from zip, jar, or xpi failed (could not unzip)\n");
|
||||
return $self->status(1);
|
||||
}
|
||||
|
||||
# Figure out the top-level contents of the tarball.
|
||||
# If we'd pass "." to tar we'd get the same contents, but the filenames
|
||||
# would start with ./, which is confusing later.
|
||||
# This should also be more reliable than, say, changing directories and
|
||||
# globbing.
|
||||
unless (opendir(TMPDIR, $tempdir)) {
|
||||
ds_die("Can't open $tempdir $!\n");
|
||||
return $self->status(1);
|
||||
}
|
||||
my @files = grep { $_ ne "." && $_ ne ".." } readdir(TMPDIR);
|
||||
close TMPDIR;
|
||||
|
||||
# tar it all up
|
||||
spawn(
|
||||
exec => [
|
||||
'tar', '--owner=root',
|
||||
'--group=root', '--mode=a+rX',
|
||||
'--create', '--file',
|
||||
"$destfiletar", '--directory',
|
||||
$tempdir, @files
|
||||
],
|
||||
wait_child => 1
|
||||
);
|
||||
unless (-e "$destfiletar") {
|
||||
ds_die(
|
||||
"Repacking from zip or jar to tar.$destext failed (could not create tarball)\n"
|
||||
);
|
||||
return $self->status(1);
|
||||
}
|
||||
eval {
|
||||
compress_archive($destfiletar, $destfile,
|
||||
$self->config->compression);
|
||||
};
|
||||
if ($@) {
|
||||
ds_die($@);
|
||||
return $self->status(1);
|
||||
}
|
||||
|
||||
# rename means the user did not want this file to exist afterwards
|
||||
if ($self->config->mode eq "rename") {
|
||||
unlink $upstream_tar;
|
||||
$zipfile_deleted++;
|
||||
}
|
||||
|
||||
$self->config->mode('repack');
|
||||
$upstream_tar = $destfile;
|
||||
} elsif (compression_guess_from_file($upstream_tar) =~ /^zstd?$/) {
|
||||
$self->config->force_repack(1);
|
||||
}
|
||||
|
||||
# From now on, $upstream_tar is guaranteed to be a tarball, usually
|
||||
# compressed. It is always a full (possibly relative) path, and distinct
|
||||
# from $destfile.
|
||||
|
||||
# Find out if we have to repack
|
||||
my $do_repack = 0;
|
||||
if ($self->config->repack) {
|
||||
my $comp = compression_guess_from_file($upstream_tar);
|
||||
unless ($comp) {
|
||||
ds_die("Cannot determine compression method of $upstream_tar");
|
||||
return $self->status(1);
|
||||
}
|
||||
$do_repack = (
|
||||
$comp eq 'tar'
|
||||
or ( $self->config->compression ne 'default'
|
||||
and $comp ne $self->config->compression)
|
||||
or ( $self->config->compression eq 'default'
|
||||
and $comp ne
|
||||
&Devscripts::MkOrigtargz::Config::default_compression));
|
||||
}
|
||||
|
||||
# Removing files
|
||||
my $deletecount = 0;
|
||||
my @to_delete;
|
||||
|
||||
if (@{ $self->exclude_globs }) {
|
||||
my @files;
|
||||
my $files;
|
||||
spawn(
|
||||
exec => ['tar', '-t', '-a', '-f', $upstream_tar],
|
||||
to_string => \$files,
|
||||
wait_child => 1
|
||||
);
|
||||
@files = split /^/, $files;
|
||||
chomp @files;
|
||||
|
||||
my %delete;
|
||||
# find out what to delete
|
||||
my @exclude_info;
|
||||
eval {
|
||||
@exclude_info
|
||||
= map { { glob => $_, used => 0, regex => glob_to_regex($_) } }
|
||||
@{ $self->exclude_globs };
|
||||
};
|
||||
if ($@) {
|
||||
ds_die($@);
|
||||
return $self->status(1);
|
||||
}
|
||||
for my $filename (sort @files) {
|
||||
my $last_match;
|
||||
for my $info (@exclude_info) {
|
||||
if (
|
||||
$filename
|
||||
=~ m@^(?:[^/]*/)? # Possible leading directory, ignore it
|
||||
(?:$info->{regex}) # User pattern
|
||||
(?:/.*)?$ # Possible trailing / for a directory
|
||||
@x
|
||||
) {
|
||||
if (!$last_match) {
|
||||
# if the current entry is a directory, check if it
|
||||
# matches any exclude-ignored glob
|
||||
my $ignore_this_exclude = 0;
|
||||
for my $ignore_exclude (@{ $self->include_globs }) {
|
||||
my $ignore_exclude_regex
|
||||
= glob_to_regex($ignore_exclude);
|
||||
|
||||
if ($filename =~ $ignore_exclude_regex) {
|
||||
$ignore_this_exclude = 1;
|
||||
last;
|
||||
}
|
||||
if ( $filename =~ m,/$,
|
||||
&& $ignore_exclude =~ $info->{regex}) {
|
||||
$ignore_this_exclude = 1;
|
||||
last;
|
||||
}
|
||||
}
|
||||
next if $ignore_this_exclude;
|
||||
$delete{$filename} = 1;
|
||||
}
|
||||
$last_match = $info;
|
||||
}
|
||||
}
|
||||
if (defined $last_match) {
|
||||
$last_match->{used} = 1;
|
||||
}
|
||||
}
|
||||
|
||||
for my $info (@exclude_info) {
|
||||
if (!$info->{used}) {
|
||||
ds_warn
|
||||
"No files matched excluded pattern as the last matching glob: $info->{glob}\n";
|
||||
}
|
||||
}
|
||||
|
||||
# ensure files are mentioned before the directory they live in
|
||||
# (otherwise tar complains)
|
||||
@to_delete = sort { $b cmp $a } keys %delete;
|
||||
|
||||
$deletecount = scalar(@to_delete);
|
||||
}
|
||||
|
||||
if ($deletecount or $self->config->force_repack) {
|
||||
$destfilebase = sprintf "%s_%s%s.%s.tar", $self->config->package,
|
||||
$self->config->version, $self->config->repack_suffix,
|
||||
$self->config->orig;
|
||||
$destfiletar = sprintf "%s/%s", $self->config->directory,
|
||||
$destfilebase;
|
||||
$destfile = $self->fix_dest_file($destfiletar);
|
||||
|
||||
# Zip -> tar process already created $destfile, so need to rename it
|
||||
if ($self->config->upstream_type eq 'zip') {
|
||||
move($upstream_tar, $destfile);
|
||||
$upstream_tar = $destfile;
|
||||
}
|
||||
}
|
||||
|
||||
# Actually do the unpack, remove, pack cycle
|
||||
if ($do_repack || $deletecount || $self->config->force_repack) {
|
||||
$destfile ||= $self->fix_dest_file($destfiletar);
|
||||
if ($self->config->signature) {
|
||||
$self->config->signature(4); # repack upstream file
|
||||
}
|
||||
if ($self->config->upstream_comp) {
|
||||
eval { decompress_archive($upstream_tar, $destfiletar) };
|
||||
if ($@) {
|
||||
ds_die($@);
|
||||
return $self->status(1);
|
||||
}
|
||||
} else {
|
||||
copy $upstream_tar, $destfiletar;
|
||||
}
|
||||
unlink $upstream_tar if $self->config->mode eq "rename";
|
||||
# We have to use piping because --delete is broken otherwise, as
|
||||
# documented at
|
||||
# https://www.gnu.org/software/tar/manual/html_node/delete.html
|
||||
if (@to_delete) {
|
||||
# ARG_MAX: max number of bytes exec() can handle
|
||||
my $arg_max;
|
||||
spawn(
|
||||
exec => ['getconf', 'ARG_MAX'],
|
||||
to_string => \$arg_max,
|
||||
wait_child => 1
|
||||
);
|
||||
# Under Hurd `getconf` above returns "undefined".
|
||||
# It's apparently unlimited (?), so we just use an arbitrary number.
|
||||
if ($arg_max =~ /\D/) { $arg_max = 131072; }
|
||||
# Usually NAME_MAX=255, but here we use 128 to be on the safe side.
|
||||
$arg_max = int($arg_max / 128);
|
||||
# We use this lame splice on a totally arbitrary $arg_max because
|
||||
# counting how many bytes there are in @to_delete is too
|
||||
# inefficient.
|
||||
while (my @next_n = splice @to_delete, 0, $arg_max) {
|
||||
spawn(
|
||||
exec => ['tar', '--delete', @next_n],
|
||||
from_file => $destfiletar,
|
||||
to_file => $destfiletar . ".tmp",
|
||||
wait_child => 1
|
||||
) if scalar(@next_n) > 0;
|
||||
move($destfiletar . ".tmp", $destfiletar);
|
||||
}
|
||||
}
|
||||
eval {
|
||||
compress_archive($destfiletar, $destfile,
|
||||
$self->config->compression);
|
||||
};
|
||||
if ($@) {
|
||||
ds_die $@;
|
||||
return $self->status(1);
|
||||
}
|
||||
|
||||
# Symlink no longer makes sense
|
||||
$self->config->mode('repack');
|
||||
$upstream_tar = $destfile;
|
||||
} else {
|
||||
$destfile = $self->fix_dest_file($destfiletar,
|
||||
compression_guess_from_file($upstream_tar), 1);
|
||||
}
|
||||
|
||||
# Final step: symlink, copy or rename for tarball.
|
||||
|
||||
my $same_name = abs_path($destfile) eq abs_path($self->config->upstream);
|
||||
unless ($same_name) {
|
||||
if ( $self->config->mode ne "repack"
|
||||
and $upstream_tar ne $self->config->upstream) {
|
||||
ds_die "Assertion failed";
|
||||
return $self->status(1);
|
||||
}
|
||||
|
||||
if ($self->config->mode eq "symlink") {
|
||||
my $rel
|
||||
= File::Spec->abs2rel($upstream_tar, $self->config->directory);
|
||||
symlink $rel, $destfile;
|
||||
} elsif ($self->config->mode eq "copy") {
|
||||
copy($upstream_tar, $destfile);
|
||||
} elsif ($self->config->mode eq "rename") {
|
||||
move($upstream_tar, $destfile);
|
||||
}
|
||||
}
|
||||
|
||||
# Final step: symlink, copy or rename for signature file.
|
||||
|
||||
my $destsigfile;
|
||||
if ($self->config->signature == 1) {
|
||||
$destsigfile = sprintf "%s.asc", $destfile;
|
||||
} elsif ($self->config->signature == 2) {
|
||||
$destsigfile = sprintf "%s.asc", $destfiletar;
|
||||
} elsif ($self->config->signature == 3) {
|
||||
# XXX FIXME XXX place holder
|
||||
$destsigfile = sprintf "%s.asc", $destfile;
|
||||
} else {
|
||||
# $self->config->signature == 0 or 4
|
||||
$destsigfile = "";
|
||||
}
|
||||
|
||||
if ($self->config->signature == 1 or $self->config->signature == 2) {
|
||||
my $is_openpgp_ascii_armor = 0;
|
||||
my $fh_sig;
|
||||
unless (open($fh_sig, '<', $self->config->signature_file)) {
|
||||
ds_die "Cannot open $self->{config}->{signature_file}\n";
|
||||
return $self->status(1);
|
||||
}
|
||||
while (<$fh_sig>) {
|
||||
if (m/^-----BEGIN PGP /) {
|
||||
$is_openpgp_ascii_armor = 1;
|
||||
last;
|
||||
}
|
||||
}
|
||||
close($fh_sig);
|
||||
|
||||
if (not $is_openpgp_ascii_armor) {
|
||||
my @enarmor
|
||||
= `gpg --no-options --output - --enarmor $self->{config}->{signature_file} 2>&1`;
|
||||
unless ($? == 0) {
|
||||
ds_die
|
||||
"Failed to convert $self->{config}->{signature_file} to *.asc\n";
|
||||
return $self->status(1);
|
||||
}
|
||||
unless (open(DESTSIG, '>', $destsigfile)) {
|
||||
ds_die "Failed to open $destsigfile for write $!\n";
|
||||
return $self->status(1);
|
||||
}
|
||||
foreach my $line (@enarmor) {
|
||||
next if $line =~ m/^Version:/;
|
||||
next if $line =~ m/^Comment:/;
|
||||
$line =~ s/ARMORED FILE/SIGNATURE/;
|
||||
print DESTSIG $line;
|
||||
}
|
||||
unless (close(DESTSIG)) {
|
||||
ds_die
|
||||
"Cannot write signature file $self->{config}->{signature_file}\n";
|
||||
return $self->status(1);
|
||||
}
|
||||
} else {
|
||||
if (abs_path($self->config->signature_file) ne
|
||||
abs_path($destsigfile)) {
|
||||
if ($self->config->mode eq "symlink") {
|
||||
my $rel = File::Spec->abs2rel(
|
||||
$self->config->signature_file,
|
||||
$self->config->directory
|
||||
);
|
||||
symlink $rel, $destsigfile;
|
||||
} elsif ($self->config->mode eq "copy") {
|
||||
copy($self->config->signature_file, $destsigfile);
|
||||
} elsif ($self->config->mode eq "rename") {
|
||||
move($self->config->signature_file, $destsigfile);
|
||||
} else {
|
||||
ds_die 'Strange mode="' . $self->config->mode . "\"\n";
|
||||
return $self->status(1);
|
||||
}
|
||||
}
|
||||
}
|
||||
} elsif ($self->config->signature == 3) {
|
||||
uscan_msg_raw
|
||||
"Skip adding upstream signature since upstream file has non-detached signature file.";
|
||||
} elsif ($self->config->signature == 4) {
|
||||
uscan_msg_raw
|
||||
"Skip adding upstream signature since upstream file is repacked.";
|
||||
}
|
||||
|
||||
# Final check: Is the tarball usable
|
||||
|
||||
# We are lazy and rely on Dpkg::IPC to report an error message
|
||||
# (spawn does not report back the error code).
|
||||
# We don't expect this to occur often anyways.
|
||||
my $ret = spawn(
|
||||
exec => ['tar', '--list', '--auto-compress', '--file', $destfile],
|
||||
wait_child => 1,
|
||||
to_file => '/dev/null'
|
||||
);
|
||||
|
||||
# Tell the user what we did
|
||||
|
||||
my $upstream_nice = File::Spec->canonpath($self->config->upstream);
|
||||
my $destfile_nice = File::Spec->canonpath($destfile);
|
||||
$self->destfile_nice($destfile_nice);
|
||||
|
||||
if ($same_name) {
|
||||
uscan_msg_raw "Leaving $destfile_nice where it is";
|
||||
} else {
|
||||
if ( $self->config->upstream_type eq 'zip'
|
||||
or $do_repack
|
||||
or $deletecount
|
||||
or $self->config->force_repack) {
|
||||
uscan_msg_raw
|
||||
"Successfully repacked $upstream_nice as $destfile_nice";
|
||||
} elsif ($self->config->mode eq "symlink") {
|
||||
uscan_msg_raw
|
||||
"Successfully symlinked $upstream_nice to $destfile_nice";
|
||||
} elsif ($self->config->mode eq "copy") {
|
||||
uscan_msg_raw
|
||||
"Successfully copied $upstream_nice to $destfile_nice";
|
||||
} elsif ($self->config->mode eq "rename") {
|
||||
uscan_msg_raw
|
||||
"Successfully renamed $upstream_nice to $destfile_nice";
|
||||
} else {
|
||||
ds_die 'Unknown mode ' . $self->config->mode;
|
||||
return $self->status(1);
|
||||
}
|
||||
}
|
||||
|
||||
if ($deletecount) {
|
||||
uscan_msg_raw ", deleting ${deletecount} files from it";
|
||||
}
|
||||
if ($zipfile_deleted) {
|
||||
uscan_msg_raw ", and removed the original file";
|
||||
}
|
||||
uscan_msg_raw ".\n";
|
||||
return 0;
|
||||
}
|
||||
|
||||
sub decompress_archive {
|
||||
my ($from_file, $to_file) = @_;
|
||||
my $comp = compression_guess_from_file($from_file);
|
||||
unless ($comp) {
|
||||
die("Cannot determine compression method of $from_file");
|
||||
}
|
||||
|
||||
my @cmd = compression_get_cmdline_decompress($comp);
|
||||
spawn(
|
||||
exec => \@cmd,
|
||||
from_file => $from_file,
|
||||
to_file => $to_file,
|
||||
wait_child => 1
|
||||
);
|
||||
}
|
||||
|
||||
sub compress_archive {
|
||||
my ($from_file, $to_file, $comp) = @_;
|
||||
|
||||
my @cmd = compression_get_cmdline_compress($comp);
|
||||
spawn(
|
||||
exec => \@cmd,
|
||||
from_file => $from_file,
|
||||
to_file => $to_file,
|
||||
wait_child => 1
|
||||
);
|
||||
unlink $from_file;
|
||||
}
|
||||
|
||||
# Adapted from Text::Glob::glob_to_regex_string
|
||||
sub glob_to_regex {
|
||||
my ($glob) = @_;
|
||||
|
||||
if ($glob =~ m@/$@) {
|
||||
ds_warn
|
||||
"Files-Excluded pattern ($glob) should not have a trailing /\n";
|
||||
chop($glob);
|
||||
}
|
||||
if ($glob =~ m/(?<!\\)(?:\\{2})*\\(?![\\*?])/) {
|
||||
die
|
||||
"Invalid Files-Excluded pattern ($glob), \\ can only escape \\, *, or ? characters\n";
|
||||
}
|
||||
|
||||
my ($regex, $escaping);
|
||||
for my $c ($glob =~ m/(.)/gs) {
|
||||
if (
|
||||
$c eq '.'
|
||||
|| $c eq '('
|
||||
|| $c eq ')'
|
||||
|| $c eq '|'
|
||||
|| $c eq '+'
|
||||
|| $c eq '^'
|
||||
|| $c eq '$'
|
||||
|| $c eq '@'
|
||||
|| $c eq '%'
|
||||
|| $c eq '{'
|
||||
|| $c eq '}'
|
||||
|| $c eq '['
|
||||
|| $c eq ']'
|
||||
||
|
||||
|
||||
# Escape '#' since we're using /x in the pattern match
|
||||
$c eq '#'
|
||||
) {
|
||||
$regex .= "\\$c";
|
||||
} elsif ($c eq '*') {
|
||||
$regex .= $escaping ? "\\*" : ".*";
|
||||
} elsif ($c eq '?') {
|
||||
$regex .= $escaping ? "\\?" : ".";
|
||||
} elsif ($c eq "\\") {
|
||||
if ($escaping) {
|
||||
$regex .= "\\\\";
|
||||
$escaping = 0;
|
||||
} else {
|
||||
$escaping = 1;
|
||||
}
|
||||
next;
|
||||
} else {
|
||||
$regex .= $c;
|
||||
$escaping = 0;
|
||||
}
|
||||
$escaping = 0;
|
||||
}
|
||||
|
||||
return $regex;
|
||||
}
|
||||
|
||||
sub parse_copyrights {
|
||||
my ($self) = @_;
|
||||
for my $copyright_file (@{ $self->config->copyright_file }) {
|
||||
my $data = Dpkg::Control::Hash->new();
|
||||
my $okformat
|
||||
= qr'https?://www.debian.org/doc/packaging-manuals/copyright-format/[.\d]+';
|
||||
eval {
|
||||
$data->load($copyright_file);
|
||||
1;
|
||||
} or do {
|
||||
undef $data;
|
||||
};
|
||||
if (not -e $copyright_file) {
|
||||
ds_die "File $copyright_file not found.";
|
||||
return $self->status(1);
|
||||
} elsif ($data
|
||||
&& defined $data->{format}
|
||||
&& $data->{format} =~ m@^$okformat/?$@) {
|
||||
if ($data->{ $self->config->excludestanza }) {
|
||||
push(
|
||||
@{ $self->exclude_globs },
|
||||
grep { $_ }
|
||||
split(/\s+/, $data->{ $self->config->excludestanza }));
|
||||
}
|
||||
if ($data->{ $self->config->includestanza }) {
|
||||
push(
|
||||
@{ $self->include_globs },
|
||||
grep { $_ }
|
||||
split(/\s+/, $data->{ $self->config->includestanza }));
|
||||
}
|
||||
} else {
|
||||
if (open my $file, '<', $copyright_file) {
|
||||
while (my $line = <$file>) {
|
||||
if ($line =~ m/\b$self->{config}->{excludestanza}.*:/i) {
|
||||
ds_warn "The file $copyright_file mentions "
|
||||
. $self->config->excludestanza
|
||||
. ", but its "
|
||||
. "format is not recognized. Specify Format: "
|
||||
. "https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ "
|
||||
. "in order to remove files from the tarball with mk-origtargz.\n";
|
||||
last;
|
||||
}
|
||||
}
|
||||
close $file;
|
||||
} else {
|
||||
ds_die "Unable to read $copyright_file: $!\n";
|
||||
return $self->status(1);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
sub fix_dest_file {
|
||||
my ($self, $destfiletar, $comp, $force) = @_;
|
||||
if ($self->config->compression eq 'default' or $force) {
|
||||
$self->config->compression($comp
|
||||
|| &Devscripts::MkOrigtargz::Config::default_compression);
|
||||
}
|
||||
$comp = compression_get_file_extension($self->config->compression);
|
||||
$found_comp ||= $self->config->compression;
|
||||
return sprintf "%s.%s", $destfiletar, $comp;
|
||||
}
|
||||
|
||||
1;
|
243
lib/Devscripts/MkOrigtargz/Config.pm
Normal file
|
@ -0,0 +1,243 @@
|
|||
package Devscripts::MkOrigtargz::Config;
|
||||
|
||||
use strict;
|
||||
|
||||
use Devscripts::Compression qw'compression_is_supported
|
||||
compression_guess_from_file';
|
||||
use Devscripts::Uscan::Output;
|
||||
use Dpkg::Path qw(find_command);
|
||||
use Exporter 'import';
|
||||
use Moo;
|
||||
|
||||
use constant default_compression => 'xz';
|
||||
|
||||
# regexp-assemble << END
|
||||
# tar\.gz
|
||||
# tgz
|
||||
# tar\.bz2
|
||||
# tbz2?
|
||||
# tar\.lz(?:ma)?
|
||||
# tlz(?:ma?)?
|
||||
# tar\.xz
|
||||
# txz
|
||||
# tar\.Z
|
||||
# tar
|
||||
# tar.zst
|
||||
# tar.zstd
|
||||
# END
|
||||
use constant tar_regex =>
|
||||
qr/t(?:ar(?:\.(?:lz(?:ma)?|[gx]z|bz2|Z)|.zstd?)?|lz(?:ma?)?|[gx]z|bz2?)$/;
|
||||
|
||||
extends 'Devscripts::Config';
|
||||
|
||||
# Command-line parameters
|
||||
has component => (is => 'rw');
|
||||
has compression => (is => 'rw');
|
||||
has copyright_file => (is => 'rw');
|
||||
has directory => (is => 'rw');
|
||||
has exclude_file => (is => 'rw');
|
||||
has include_file => (is => 'rw');
|
||||
has force_repack => (is => 'rw');
|
||||
has package => (is => 'rw');
|
||||
has signature => (is => 'rw');
|
||||
has signature_file => (is => 'rw');
|
||||
has repack => (is => 'rw');
|
||||
has repack_suffix => (is => 'rw');
|
||||
has unzipopt => (is => 'rw');
|
||||
has version => (is => 'rw');
|
||||
|
||||
# Internal accessors
|
||||
has mode => (is => 'rw');
|
||||
has orig => (is => 'rw', default => sub { 'orig' });
|
||||
has excludestanza => (is => 'rw', default => sub { 'Files-Excluded' });
|
||||
has includestanza => (is => 'rw', default => sub { 'Files-Included' });
|
||||
has upstream => (is => 'rw');
|
||||
has upstream_type => (is => 'rw');
|
||||
has upstream_comp => (is => 'rw');
|
||||
|
||||
use constant keys => [
|
||||
['package=s'],
|
||||
['version|v=s'],
|
||||
[
|
||||
'component|c=s',
|
||||
undef,
|
||||
sub {
|
||||
if ($_[1]) {
|
||||
$_[0]->orig("orig-$_[1]");
|
||||
$_[0]->excludestanza("Files-Excluded-$_[1]");
|
||||
$_[0]->includestanza("Files-Included-$_[1]");
|
||||
}
|
||||
1;
|
||||
|
||||
}
|
||||
],
|
||||
['directory|C=s'],
|
||||
['exclude-file=s', undef, undef, sub { [] }],
|
||||
['include-file=s', undef, undef, sub { [] }],
|
||||
['force-repack'],
|
||||
['copyright-file=s', undef, undef, sub { [] }],
|
||||
['signature=i', undef, undef, 0],
|
||||
['signature-file=s', undef, undef, ''],
|
||||
[
|
||||
'compression=s',
|
||||
undef,
|
||||
sub {
|
||||
return (0, "Unknown compression scheme $_[1]")
|
||||
unless ($_[1] eq 'default' or compression_is_supported($_[1]));
|
||||
$_[0]->compression($_[1]);
|
||||
},
|
||||
],
|
||||
['symlink', undef, \&setmode],
|
||||
['rename', undef, \&setmode],
|
||||
['copy', undef, \&setmode],
|
||||
['repack'],
|
||||
['repack-suffix|S=s', undef, undef, ''],
|
||||
['unzipopt=s'],
|
||||
];
|
||||
|
||||
use constant rules => [
|
||||
# Check --package if --version is used
|
||||
sub {
|
||||
return (
|
||||
(defined $_[0]->{package} and not defined $_[0]->{version})
|
||||
? (0, 'If you use --package, you also have to specify --version')
|
||||
: (1));
|
||||
},
|
||||
# Check that a tarball has been given and store it in $self->upstream
|
||||
sub {
|
||||
return (0, 'Please specify original tarball') unless (@ARGV == 1);
|
||||
$_[0]->upstream($ARGV[0]);
|
||||
return (
|
||||
-r $_[0]->upstream
|
||||
? (1)
|
||||
: (0, "Could not read $_[0]->{upstream}: $!"));
|
||||
},
|
||||
# Get Debian package name and version unless given
|
||||
sub {
|
||||
my ($self) = @_;
|
||||
unless (defined $self->package) {
|
||||
|
||||
# get package name
|
||||
my $c = Dpkg::Changelog::Debian->new(range => { count => 1 });
|
||||
$c->load('debian/changelog');
|
||||
if (my $msg = $c->get_parse_errors()) {
|
||||
return (0, "could not parse debian/changelog:\n$msg");
|
||||
}
|
||||
my ($entry) = @{$c};
|
||||
$self->package($entry->get_source());
|
||||
|
||||
# get version number
|
||||
unless (defined $self->version) {
|
||||
my $debversion = Dpkg::Version->new($entry->get_version());
|
||||
if ($debversion->is_native()) {
|
||||
return (0,
|
||||
"Package with native version number $debversion; "
|
||||
. "mk-origtargz makes no sense for native packages."
|
||||
);
|
||||
}
|
||||
$self->version($debversion->version());
|
||||
}
|
||||
|
||||
unshift @{ $self->copyright_file }, "debian/copyright"
|
||||
if -r "debian/copyright";
|
||||
|
||||
# set destination directory
|
||||
unless (defined $self->directory) {
|
||||
$self->directory('..');
|
||||
}
|
||||
} else {
|
||||
unless (defined $self->directory) {
|
||||
$self->directory('.');
|
||||
}
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
# Get upstream type and compression
|
||||
sub {
|
||||
my ($self) = @_;
|
||||
my $mime = compression_guess_from_file($self->upstream);
|
||||
|
||||
if (defined $mime and $mime eq 'zip') {
|
||||
$self->upstream_type('zip');
|
||||
my ($prog, $pkg);
|
||||
if ($self->upstream =~ /\.xpi$/i) {
|
||||
$self->upstream_comp('xpi');
|
||||
} else {
|
||||
$self->upstream_comp('zip');
|
||||
}
|
||||
$prog = $pkg = 'unzip';
|
||||
return (0,
|
||||
"$prog binary not found."
|
||||
. " You need to install the package $pkg"
|
||||
. " to be able to repack "
|
||||
. $self->upstream_comp
|
||||
. " upstream archives.\n")
|
||||
unless (find_command($prog));
|
||||
} else {
|
||||
if ($self->upstream =~ /\.tar$/ and $mime eq 'tar') {
|
||||
$self->upstream_type('tar');
|
||||
$self->upstream_comp('');
|
||||
} elsif ($mime) {
|
||||
$self->upstream_type('tar');
|
||||
$self->upstream_comp($mime);
|
||||
unless ($self->upstream =~ tar_regex) {
|
||||
return (1,
|
||||
'Parameter '
|
||||
. $self->upstream
|
||||
. ' does not have a file extension, guessed a tarball compressed with '
|
||||
. $self->upstream_comp
|
||||
. '.');
|
||||
}
|
||||
} else {
|
||||
return (0, "Unknown compression used in $self->{upstream}");
|
||||
}
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
# Default compression
|
||||
sub {
|
||||
my ($self) = @_;
|
||||
|
||||
# Case 1: format is 1.0
|
||||
if (-r 'debian/source/format') {
|
||||
open F, 'debian/source/format';
|
||||
my $str = <F>;
|
||||
unless ($str =~ /^([\d\.]+)/ and $1 >= 2.0) {
|
||||
ds_warn
|
||||
"Source format is earlier than 2.0, switch compression to gzip";
|
||||
$self->compression('gzip');
|
||||
$self->repack(1) unless ($self->upstream_comp eq 'gzip');
|
||||
}
|
||||
close F;
|
||||
} elsif (-d 'debian') {
|
||||
ds_warn "Missing debian/source/format, switch compression to gzip";
|
||||
$self->compression('gzip');
|
||||
$self->repack(1) unless ($self->upstream_comp eq 'gzip');
|
||||
} elsif ($self->upstream_type eq 'tar') {
|
||||
|
||||
# Uncompressed tar
|
||||
if (!$self->upstream_comp) {
|
||||
$self->repack(1);
|
||||
}
|
||||
}
|
||||
# Set to default. Will be changed after setting do_repack
|
||||
$self->compression('default')
|
||||
unless ($self->compression);
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
my ($self) = @_;
|
||||
$self->{mode} ||= 'symlink';
|
||||
},
|
||||
];
|
||||
|
||||
sub setmode {
|
||||
my ($self, $nv, $kname) = @_;
|
||||
return unless ($nv);
|
||||
if (defined $self->mode and $self->mode ne $kname) {
|
||||
return (0, "--$self->{mode} and --$kname are mutually exclusive");
|
||||
}
|
||||
$self->mode($kname);
|
||||
}
|
||||
|
||||
1;
|
83
lib/Devscripts/Output.pm
Normal file
|
@ -0,0 +1,83 @@
|
|||
package Devscripts::Output;
|
||||
|
||||
use strict;
|
||||
use Exporter 'import';
|
||||
use File::Basename;
|
||||
use constant accept => qr/^y(?:es)?\s*$/i;
|
||||
use constant refuse => qr/^n(?:o)?\s*$/i;
|
||||
|
||||
our @EXPORT = (
|
||||
qw(ds_debug ds_extra_debug ds_verbose ds_warn ds_error
|
||||
ds_die ds_msg who_called $progname $verbose
|
||||
ds_prompt accept refuse $ds_yes)
|
||||
);
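# A hedged usage sketch (the message text is invented): ds_prompt()
# honours the global $ds_yes switch, and the reply can be matched
# against the accept()/refuse() patterns exported above.
#
#   if (ds_prompt("Really delete the file? (y/N) ") =~ accept) {
#       ds_verbose "deleting";
#   }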
|
||||
|
||||
# ACCESSORS
|
||||
our ($verbose, $die_on_error, $ds_yes) = (0, 1, 0);
|
||||
|
||||
our $progname = basename($0);
|
||||
|
||||
sub printwarn {
|
||||
my ($msg, $w) = @_;
|
||||
chomp $msg;
|
||||
if ($w) {
|
||||
print STDERR "$msg\n";
|
||||
} else {
|
||||
print "$msg\n";
|
||||
}
|
||||
}
|
||||
|
||||
sub ds_msg {
|
||||
my $msg = $_[0];
|
||||
printwarn("$progname: $msg", $_[1]);
|
||||
}
|
||||
|
||||
sub ds_verbose {
|
||||
my $msg = $_[0];
|
||||
if ($verbose > 0) {
|
||||
printwarn("$progname info: $msg", $_[1]);
|
||||
}
|
||||
}
|
||||
|
||||
sub who_called {
|
||||
return '' unless ($verbose > 1);
|
||||
my @out = caller(1);
|
||||
return " [$out[0]: $out[2]]";
|
||||
}
|
||||
|
||||
sub ds_warn {
|
||||
my $msg = $_[0];
|
||||
printwarn("$progname warn: $msg" . who_called, 1);
|
||||
}
|
||||
|
||||
sub ds_debug {
|
||||
my $msg = $_[0];
|
||||
printwarn("$progname debug: $msg", $_[1]) if $verbose > 1;
|
||||
}
|
||||
|
||||
sub ds_extra_debug {
|
||||
my $msg = $_[0];
|
||||
printwarn("$progname debug: $msg", $_[1]) if $verbose > 2;
|
||||
}
|
||||
|
||||
*ds_die = \&ds_error;
|
||||
|
||||
sub ds_error {
|
||||
my $msg = $_[0];
|
||||
$msg = "$progname error: $msg" . who_called;
|
||||
if ($die_on_error) {
|
||||
print STDERR "$msg\n";
|
||||
exit 1;
|
||||
}
|
||||
printwarn($msg, 1);
|
||||
}
|
||||
|
||||
sub ds_prompt {
|
||||
return 'yes' if ($ds_yes > 0);
|
||||
print STDERR shift;
|
||||
my $s = <STDIN>;
|
||||
chomp $s;
|
||||
return $s;
|
||||
}
|
||||
|
||||
1;
|
307
lib/Devscripts/PackageDeps.pm
Normal file
|
@ -0,0 +1,307 @@
|
|||
# Based vaguely on the deprecated dpkg-perl package modules
|
||||
# Dpkg::Package::List and Dpkg::Package::Package.
|
||||
# This module creates an object which holds package names and dependencies
|
||||
# (just Depends and Pre-Depends).
|
||||
# It can also calculate the total set of subdependencies using the
|
||||
# full_dependencies method.
|
||||
#
|
||||
# Copyright 2002 Julian Gilbey <jdg@debian.org>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
package Devscripts::PackageDeps;
|
||||
use strict;
|
||||
use Carp;
|
||||
use Dpkg::Control;
|
||||
use Dpkg::IPC;
|
||||
use FileHandle;
|
||||
require 5.006_000;
|
||||
|
||||
# This reads in a package file list, such as /var/lib/dpkg/status,
|
||||
# and parses it. Using /var/lib/dpkg/status is deprecated in favor of
|
||||
# fromStatus().
|
||||
|
||||
# Syntax: Devscripts::PackageDeps->new($filename)
|
||||
|
||||
sub new ($$) {
|
||||
my $this = shift;
|
||||
my $class = ref($this) || $this;
|
||||
my $filename = shift;
|
||||
|
||||
my $self = {};
|
||||
|
||||
if (!defined $filename) {
|
||||
croak("requires filename as parameter");
|
||||
}
|
||||
|
||||
bless($self, $class);
|
||||
|
||||
my $fh = FileHandle->new($filename, 'r');
|
||||
unless (defined $fh) {
|
||||
croak("Unable to load $filename: $!");
|
||||
}
|
||||
$self->parse($fh, $filename);
|
||||
$fh->close or croak("Problems encountered reading $filename: $!");
|
||||
|
||||
return $self;
|
||||
}
|
||||
|
||||
# This reads in dpkg's status information and parses it.
|
||||
|
||||
# Syntax: Devscripts::PackageDeps->fromStatus()
|
||||
|
||||
sub fromStatus ($) {
|
||||
my $this = shift;
|
||||
my $class = ref($this) || $this;
|
||||
|
||||
my $self = {};
|
||||
|
||||
bless($self, $class);
|
||||
|
||||
my $fh = FileHandle->new;
|
||||
my $pid = spawn(
|
||||
exec => ['dpkg', '--status'],
|
||||
to_pipe => $fh
|
||||
);
|
||||
unless (defined $pid) {
|
||||
croak("Unable to run 'dpkg --status': $!");
|
||||
}
|
||||
|
||||
$self->parse($fh, 'dpkg --status');
|
||||
|
||||
wait_child($pid, cmdline => 'dpkg --status', nocheck => 1);
|
||||
|
||||
return $self;
|
||||
}
|
||||
|
||||
# Internal functions
|
||||
|
||||
my $multiarch;
|
||||
|
||||
sub multiarch () {
|
||||
if (!defined $multiarch) {
|
||||
$multiarch
|
||||
= (system('dpkg --assert-multi-arch >/dev/null 2>&1') >> 8) == 0;
|
||||
}
|
||||
return $multiarch;
|
||||
}
|
||||
|
||||
sub parse ($$$) {
|
||||
my $self = shift;
|
||||
my $fh = shift;
|
||||
my $filename = shift;
|
||||
|
||||
my $ctrl;
|
||||
PACKAGE_ENTRY:
|
||||
while (defined($ctrl = Dpkg::Control->new(type => CTRL_FILE_STATUS))
|
||||
&& $ctrl->parse($fh, $filename)) {
|
||||
|
||||
# So we've got a package
|
||||
my $pkg = $ctrl->{Package};
|
||||
my @deps = ();
|
||||
|
||||
if ($ctrl->{Status} =~ /^\S+\s+\S+\s+(\S+)$/) {
|
||||
my $status = $1;
|
||||
unless ($status eq 'installed' or $status eq 'unpacked') {
|
||||
undef $ctrl;
|
||||
next PACKAGE_ENTRY;
|
||||
}
|
||||
}
|
||||
|
||||
for my $dep (qw(Depends Pre-Depends)) {
|
||||
if (exists $ctrl->{$dep}) {
|
||||
my $value = $ctrl->{$dep};
|
||||
$value =~ s/\([^)]+\)//g; # ignore versioning information
|
||||
$value =~ tr/ \t//d; # remove spaces
|
||||
my @dep_pkgs = split /,/, $value;
|
||||
foreach my $dep_pkg (@dep_pkgs) {
|
||||
my @dep_pkg_alts = split /\|/, $dep_pkg;
|
||||
if (@dep_pkg_alts == 1) { push @deps, $dep_pkg_alts[0]; }
|
||||
else { push @deps, \@dep_pkg_alts; }
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
$self->{$pkg} = \@deps;
|
||||
if ($ctrl->{Architecture} ne 'all' && multiarch) {
|
||||
my $arch = $ctrl->{Architecture};
|
||||
@deps = map { "$_:$arch" } @deps;
|
||||
$self->{"$pkg:$arch"} = \@deps;
|
||||
}
|
||||
undef $ctrl;
|
||||
}
|
||||
}
|
||||
|
||||
# Get direct dependency information for a specified package
|
||||
# Returns an array or array ref depending on context
|
||||
|
||||
# Syntax: $obj->dependencies($package)
|
||||
|
||||
sub dependencies ($$) {
|
||||
my $self = shift;
|
||||
my $pkg = shift;
|
||||
|
||||
if (!defined $pkg) {
|
||||
croak("requires package as parameter");
|
||||
}
|
||||
|
||||
if (!exists $self->{$pkg}) {
|
||||
return undef;
|
||||
}
|
||||
|
||||
return wantarray ? @{ $self->{$pkg} } : $self->{$pkg};
|
||||
}
|
||||
|
||||
# Get full dependency information for a specified package or packages,
|
||||
# including the packages themselves.
|
||||
#
|
||||
# This only follows the first of sets of alternatives, and ignores
|
||||
# dependencies on packages which do not appear to exist.
|
||||
# Returns an array or array ref
|
||||
|
||||
# Syntax: $obj->full_dependencies(@packages)
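#
# A sketch (hypothetical package names): the result contains the given
# packages themselves plus everything reachable through first alternatives;
# unknown packages are silently skipped.
#
#   my @closure = $obj->full_dependencies('foo', 'bar');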
|
||||
|
||||
sub full_dependencies ($@) {
|
||||
my $self = shift;
|
||||
my @toprocess = @_;
|
||||
my %deps;
|
||||
|
||||
return wantarray ? () : [] unless @toprocess;
|
||||
|
||||
while (@toprocess) {
|
||||
my $next = shift @toprocess;
|
||||
$next = $$next[0] if ref $next;
|
||||
# Already seen?
|
||||
next if exists $deps{$next};
|
||||
# Known package?
|
||||
next unless exists $self->{$next};
|
||||
# Mark it as a dependency
|
||||
$deps{$next} = 1;
|
||||
push @toprocess, @{ $self->{$next} };
|
||||
}
|
||||
|
||||
return wantarray ? keys %deps : [keys %deps];
|
||||
}
|
||||
|
||||
# Given a set of packages, find a minimal set with respect to the
|
||||
# pre-partial order of dependency.
|
||||
#
|
||||
# This is vaguely based on the dpkg-mindep script by
|
||||
# Bill Allombert <ballombe@debian.org>. It only follows direct
|
||||
# dependencies, and does not attempt to follow indirect dependencies.
|
||||
#
|
||||
# This respects all packages in sets of alternatives.
|
||||
# Returns: (\@minimal_set, \%dependencies)
|
||||
# where the %dependencies hash is of the form
|
||||
# non-minimal package => depending package
|
||||
|
||||
# Syntax: $obj->min_dependencies(@packages)
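#
# A sketch (hypothetical package names):
#
#   my ($min, $why) = $obj->min_dependencies(@pkgs);
#   # @$min : packages in @pkgs not implied by any other package in @pkgs
#   # %$why : non-minimal package => one package in @pkgs that depends on it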
|
||||
|
||||
sub min_dependencies ($@) {
|
||||
my $self = shift;
|
||||
my @pkgs = @_;
|
||||
my @min_pkgs = ();
|
||||
my %dep_pkgs = ();
|
||||
|
||||
return (\@min_pkgs, \%dep_pkgs) unless @pkgs;
|
||||
|
||||
# We create a directed graph: the %forward_deps hash records arrows
|
||||
# pkg A depends on pkg B; the %reverse_deps hash records the
|
||||
# reverse arrows
|
||||
my %forward_deps;
|
||||
my %reverse_deps;
|
||||
|
||||
# Initialise
|
||||
foreach my $pkg (@pkgs) {
|
||||
$forward_deps{$pkg} = {};
|
||||
$reverse_deps{$pkg} = {};
|
||||
}
|
||||
|
||||
foreach my $pkg (@pkgs) {
|
||||
next unless exists $self->{$pkg};
|
||||
my @pkg_deps = @{ $self->{$pkg} };
|
||||
while (@pkg_deps) {
|
||||
my $dep = shift @pkg_deps;
|
||||
if (ref $dep) {
|
||||
unshift @pkg_deps, @$dep;
|
||||
next;
|
||||
}
|
||||
if (exists $forward_deps{$dep}) {
|
||||
$forward_deps{$pkg}{$dep} = 1;
|
||||
$reverse_deps{$dep}{$pkg} = 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
# We start removing packages from the tree if they have no dependencies.
|
||||
# Once we have no such packages left, we must have mutual or cyclic
|
||||
# dependencies, so we pick a random one to remove and then start again.
|
||||
# We continue this until there are no packages left in the graph.
|
||||
PACKAGE:
|
||||
while (scalar keys %forward_deps) {
|
||||
foreach my $pkg (keys %forward_deps) {
|
||||
if (scalar keys %{ $forward_deps{$pkg} } == 0) {
|
||||
# Great, no dependencies!
|
||||
if (scalar keys %{ $reverse_deps{$pkg} }) {
|
||||
# This package is depended upon, so we can remove it
|
||||
# with care
|
||||
foreach my $dep_pkg (keys %{ $reverse_deps{$pkg} }) {
|
||||
# take the first mentioned package for the
|
||||
# recorded list of depended-upon packages
|
||||
$dep_pkgs{$pkg} ||= $dep_pkg;
|
||||
delete $forward_deps{$dep_pkg}{$pkg};
|
||||
}
|
||||
} else {
|
||||
# This package is not depended upon, so it must
|
||||
# go into our mindep list
|
||||
push @min_pkgs, $pkg;
|
||||
}
|
||||
# Now remove this node
|
||||
delete $forward_deps{$pkg};
|
||||
delete $reverse_deps{$pkg};
|
||||
next PACKAGE;
|
||||
}
|
||||
}
|
||||
|
||||
# Oh, we didn't find any package which didn't depend on any other.
|
||||
# We'll pick a random one, then. At least *some* package must
|
||||
# be depended upon in this situation; let's pick one of these.
|
||||
foreach my $pkg (keys %forward_deps) {
|
||||
next unless scalar keys %{ $reverse_deps{$pkg} } > 0;
|
||||
|
||||
foreach my $dep_pkg (keys %{ $forward_deps{$pkg} }) {
|
||||
delete $reverse_deps{$dep_pkg}{$pkg};
|
||||
}
|
||||
foreach my $dep_pkg (keys %{ $reverse_deps{$pkg} }) {
|
||||
# take the first mentioned package for the
|
||||
# recorded list of depended-upon packages
|
||||
$dep_pkgs{$pkg} ||= $dep_pkg;
|
||||
delete $forward_deps{$dep_pkg}{$pkg};
|
||||
}
|
||||
|
||||
# Now remove this node
|
||||
delete $forward_deps{$pkg};
|
||||
delete $reverse_deps{$pkg};
|
||||
# And onto the next package
|
||||
goto PACKAGE;
|
||||
}
|
||||
|
||||
# Ouch! We shouldn't ever get here
|
||||
croak("Couldn't determine mindeps; this can't happen!");
|
||||
}
|
||||
|
||||
return (\@min_pkgs, \%dep_pkgs);
|
||||
}
|
||||
|
||||
1;
|
313
lib/Devscripts/Packages.pm
Normal file
|
@@ -0,0 +1,313 @@
|
|||
#! /usr/bin/perl
|
||||
|
||||
# Copyright Bill Allombert <ballombe@debian.org> 2001.
|
||||
# Modifications copyright 2002 Julian Gilbey <jdg@debian.org>
|
||||
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
package Devscripts::Packages;
|
||||
|
||||
use strict;
|
||||
use warnings;
|
||||
use Carp;
|
||||
use Dpkg::Control;
|
||||
use Dpkg::IPC;
|
||||
use FileHandle;
|
||||
|
||||
BEGIN {
|
||||
use Exporter ();
|
||||
use vars qw(@EXPORT @ISA %EXPORT_TAGS);
|
||||
@EXPORT
|
||||
= qw(PackagesToFiles FilesToPackages PackagesMatch InstalledPackages);
|
||||
@ISA = qw(Exporter);
|
||||
%EXPORT_TAGS = ();
|
||||
}
|
||||
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Packages - Interface to the dpkg package database
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use Devscripts::Packages;
|
||||
|
||||
@files=PackagesToFiles(@packages);
|
||||
|
||||
@packages=FilesToPackages(@files);
|
||||
|
||||
@packages=PackagesMatch($regexp);
|
||||
|
||||
$packages_hashref=InstalledPackages($sources);
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
|
||||
PackagesToFiles: Return a list of files contained in a list of packages.
|
||||
|
||||
FilesToPackages: Return a list of packages containing at least
|
||||
one file in a list of files, taking care to handle diversions correctly.
|
||||
|
||||
PackagesMatch: Return a list of packages whose status entries match a regexp.
|
||||
|
||||
InstalledPackages: Return a ref to a hash whose keys are the installed
packages (status = install ok installed). If $sources is true, the
corresponding source packages are included as keys as well.
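
For example (a sketch only; the package and file names are illustrative):

    my @files    = PackagesToFiles('dpkg', 'perl-base');
    my @packages = FilesToPackages('/bin/ls', '/usr/bin/perl');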
|
||||
|
||||
=cut
|
||||
|
||||
my $multiarch;
|
||||
|
||||
sub multiarch () {
|
||||
if (!defined $multiarch) {
|
||||
$multiarch
|
||||
= (system('dpkg --assert-multi-arch >/dev/null 2>&1') >> 8) == 0;
|
||||
}
|
||||
return $multiarch;
|
||||
}
|
||||
|
||||
# input: a list of package names.
|
||||
# output: list of files they contain.
|
||||
|
||||
sub PackagesToFiles (@) {
|
||||
return () if @_ == 0;
|
||||
|
||||
my %files = ();
|
||||
|
||||
# We fork and exec directly, so that we don't have to worry about how long
# an argument list the shell can handle.
|
||||
|
||||
my $pid;
|
||||
my $sleep_count = 0;
|
||||
do {
|
||||
$pid = open(DPKG, "-|");
|
||||
unless (defined $pid) {
|
||||
carp("cannot fork: $!");
|
||||
croak("bailing out") if $sleep_count++ > 6;
|
||||
sleep 10;
|
||||
}
|
||||
} until defined $pid;
|
||||
|
||||
if ($pid) { # parent
|
||||
while (<DPKG>) {
|
||||
chomp;
|
||||
next if /^package diverts others to: / or -d $_;
|
||||
$files{$_} = 1;
|
||||
}
|
||||
close DPKG or croak("dpkg -L failed: $!");
|
||||
} else { # child
|
||||
# We must use C locale, else diversion messages may be translated.
|
||||
$ENV{'LC_ALL'} = 'C';
|
||||
exec('dpkg', '-L', @_)
|
||||
or croak("can't exec dpkg -L: $!");
|
||||
}
|
||||
|
||||
return keys %files;
|
||||
}
|
||||
|
||||
# This basically runs a dpkg -S with a few bells and whistles
|
||||
#
|
||||
# input: a list of files.
|
||||
# output: list of packages they belong to.
|
||||
|
||||
sub FilesToPackages (@) {
|
||||
return () if @_ == 0;
|
||||
|
||||
# We fork and exec directly, so that we don't have to worry about how long
# an argument list the shell can handle.
|
||||
|
||||
my @dpkg_out;
|
||||
my $pid;
|
||||
my $sleep_count = 0;
|
||||
do {
|
||||
$pid = open(DPKG, "-|");
|
||||
unless (defined $pid) {
|
||||
carp("cannot fork: $!");
|
||||
croak("bailing out") if $sleep_count++ > 6;
|
||||
sleep 10;
|
||||
}
|
||||
} until defined $pid;
|
||||
|
||||
if ($pid) { # parent
|
||||
while (<DPKG>) {
|
||||
# We'll process it later
|
||||
chomp;
|
||||
push @dpkg_out, $_;
|
||||
}
|
||||
if (!close DPKG) {
|
||||
# exit status of 1 just indicates unrecognised files
|
||||
if ($? & 0xff || $? >> 8 != 1) {
|
||||
carp( "warning: dpkg -S exited with signal "
|
||||
. ($? & 0xff)
|
||||
. " and status "
|
||||
. ($? >> 8));
|
||||
}
|
||||
}
|
||||
} else { # child
|
||||
# We must use C locale, else diversion messages may be translated.
|
||||
$ENV{'LC_ALL'} = 'C';
|
||||
open STDERR, '>& STDOUT'; # Capture STDERR as well
|
||||
exec('dpkg', '-S', @_)
|
||||
or croak("can't exec dpkg -S: $!");
|
||||
}
|
||||
|
||||
my %packages = ();
|
||||
foreach my $curfile (@_) {
|
||||
my $pkgfrom;
|
||||
foreach my $line (@dpkg_out) {
|
||||
# We want to handle diversions nicely.
|
||||
# Ignore local diversions
|
||||
if ($line =~ /^local diversion from: /) {
|
||||
# Do nothing
|
||||
} elsif ($line =~ /^local diversion to: (.+)$/) {
|
||||
if ($curfile eq $1) {
|
||||
last;
|
||||
}
|
||||
} elsif ($line =~ /^diversion by (\S+) from: (.+)$/) {
|
||||
if ($curfile eq $2) {
|
||||
# So the file we're looking at has been diverted
|
||||
$pkgfrom = $1;
|
||||
}
|
||||
} elsif ($line =~ /^diversion by (\S+) to: (.+)$/) {
|
||||
if ($curfile eq $2) {
|
||||
# So the file we're looking at is a diverted file
|
||||
# We shouldn't see it again
|
||||
$packages{$1} = 1;
|
||||
last;
|
||||
}
|
||||
} elsif ($line =~ /^dpkg: \Q$curfile\E not found\.$/) {
|
||||
last;
|
||||
} elsif ($line
|
||||
=~ /^dpkg-query: no path found matching pattern \Q$curfile\E\.$/
|
||||
) {
|
||||
last;
|
||||
} elsif ($line =~ /^(.*): \Q$curfile\E$/) {
|
||||
my @pkgs = split /,\s+/, $1;
|
||||
if (@pkgs == 1 || !grep /:/, @pkgs) {
|
||||
# Only one package, or all Multi-Arch packages
|
||||
map { $packages{$_} = 1 } @pkgs;
|
||||
} else {
|
||||
# We've got a file which has been diverted by some package
|
||||
# or is Multi-Arch and so is listed in two packages. If it
|
||||
# was diverted, the *diverting* package is the one with the
|
||||
# file that was actually used.
|
||||
my $found = 0;
|
||||
foreach my $pkg (@pkgs) {
|
||||
if ($pkg eq $pkgfrom) {
|
||||
$packages{$pkgfrom} = 1;
|
||||
$found = 1;
|
||||
last;
|
||||
}
|
||||
}
|
||||
if (!$found) {
|
||||
carp(
|
||||
"Something wicked happened to the output of dpkg -S $curfile"
|
||||
);
|
||||
}
|
||||
}
|
||||
# Prepare for the next round
|
||||
last;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return keys %packages;
|
||||
}
|
||||
|
||||
# Return a list of packages whose status entries match a given pattern
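#
# The pattern is matched against the whole "dpkg --status" stanza, so for
# instance (a sketch):
#
#   my @required = PackagesMatch('^Priority: required$');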
|
||||
|
||||
sub PackagesMatch ($) {
|
||||
my $match = $_[0];
|
||||
my @matches = ();
|
||||
|
||||
my $fout = FileHandle->new;
|
||||
my $pid = spawn(
|
||||
exec => ['dpkg', '--status'],
|
||||
to_pipe => $fout
|
||||
);
|
||||
unless (defined $pid) {
|
||||
croak("Unable to run \"dpkg --status\": $!");
|
||||
}
|
||||
|
||||
my $ctrl;
|
||||
while (defined($ctrl = Dpkg::Control->new())
|
||||
&& $ctrl->parse($fout, 'dpkg --status')) {
|
||||
if ("$ctrl" =~ m/$match/m) {
|
||||
my $package = $ctrl->{Package};
|
||||
if ($ctrl->{Architecture} ne 'all' && multiarch) {
|
||||
$package .= ":$ctrl->{Architecture}";
|
||||
}
|
||||
push @matches, $package;
|
||||
}
|
||||
undef $ctrl;
|
||||
}
|
||||
|
||||
wait_child($pid, cmdline => 'dpkg --status', nocheck => 1);
|
||||
return @matches;
|
||||
}
|
||||
|
||||
# Which packages are installed (Package and Source)?
|
||||
|
||||
sub InstalledPackages ($) {
|
||||
my $source = $_[0];
|
||||
|
||||
my $fout = FileHandle->new;
|
||||
my $pid = spawn(
|
||||
exec => ['dpkg', '--status'],
|
||||
to_pipe => $fout
|
||||
);
|
||||
unless (defined $pid) {
|
||||
croak("Unable to run \"dpkg --status\": $!");
|
||||
}
|
||||
|
||||
my $ctrl;
|
||||
my %matches;
|
||||
while (defined($ctrl = Dpkg::Control->new(type => CTRL_FILE_STATUS))
|
||||
&& $ctrl->parse($fout, 'dpkg --status')) {
|
||||
if ($ctrl->{Status} !~ /^install\s+ok\s+installed$/) {
|
||||
next;
|
||||
}
|
||||
if ($source) {
|
||||
if (exists $ctrl->{Source}) {
|
||||
$matches{ $ctrl->{Source} } = 1;
|
||||
}
|
||||
}
|
||||
if (exists $ctrl->{Package}) {
|
||||
$matches{ $ctrl->{Package} } = 1;
|
||||
if ($ctrl->{Architecture} ne 'all' && multiarch) {
|
||||
$matches{"$ctrl->{Package}:$ctrl->{Architecture}"} = 1;
|
||||
}
|
||||
}
|
||||
undef $ctrl;
|
||||
}
|
||||
|
||||
wait_child($pid, cmdline => 'dpkg --status', nocheck => 1);
|
||||
|
||||
return \%matches;
|
||||
}
|
||||
|
||||
1;
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
Bill Allombert <ballombe@debian.org>
|
||||
|
||||
=head1 COPYING
|
||||
|
||||
Copyright 2001 Bill Allombert <ballombe@debian.org>
|
||||
Modifications copyright 2002 Julian Gilbey <jdg@debian.org>
|
||||
dpkg-depcheck is free software, covered by the GNU General Public License, and
|
||||
you are welcome to change it and/or distribute copies of it under
|
||||
certain conditions. There is absolutely no warranty for dpkg-depcheck.
|
||||
|
||||
=cut
|
427
lib/Devscripts/Salsa.pm
Executable file
|
@@ -0,0 +1,427 @@
|
|||
package Devscripts::Salsa;
|
||||
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Salsa - salsa(1) base object
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use Devscripts::Salsa;
|
||||
exit Devscripts::Salsa->new->run
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
Devscripts::Salsa provides the salsa(1) command launcher and some common
utility methods.
|
||||
|
||||
=cut
|
||||
|
||||
use strict;
|
||||
|
||||
use Devscripts::Output;
|
||||
use Devscripts::Salsa::Config;
|
||||
|
||||
BEGIN {
|
||||
eval "use GitLab::API::v4;use GitLab::API::v4::Constants qw(:all)";
|
||||
if ($@) {
|
||||
print STDERR "You must install GitLab::API::v4\n";
|
||||
exit 1;
|
||||
}
|
||||
}
|
||||
use Moo;
|
||||
use File::Basename;
|
||||
use File::Path qw(make_path);
|
||||
|
||||
# Command aliases
|
||||
use constant cmd_aliases => {
|
||||
# Alias => Filename -> ./lib/Devscripts/Salsa/*.pm
|
||||
# Preferred terminology
|
||||
check_projects => 'check_repo',
|
||||
create_project => 'create_repo',
|
||||
delete_project => 'del_repo',
|
||||
delete_user => 'del_user',
|
||||
list_projects => 'list_repos',
|
||||
list_users => 'group',
|
||||
search_groups => 'search_group',
|
||||
search_projects => 'search_project',
|
||||
search_users => 'search_user',
|
||||
update_projects => 'update_repo',
|
||||
|
||||
# Catch possible typos (as commands can act on multiple items at once)
|
||||
list_user => 'group',
|
||||
check_project => 'check_repo',
|
||||
list_project => 'list_repos',
|
||||
update_project => 'update_repo',
|
||||
|
||||
# Abbreviation
|
||||
co => 'checkout',
|
||||
ls => 'list_repos',
|
||||
mr => 'merge_request',
|
||||
mrs => 'merge_requests',
|
||||
schedule => 'pipeline_schedule',
|
||||
schedules => 'pipeline_schedules',
|
||||
|
||||
# Legacy
|
||||
search => 'search_project',
|
||||
search_repo => 'search_project',
|
||||
};
|
||||
|
||||
=head1 ACCESSORS
|
||||
|
||||
=over
|
||||
|
||||
=item B<config> : Devscripts::Salsa::Config object (parsed)
|
||||
|
||||
=cut
|
||||
|
||||
has config => (
|
||||
is => 'rw',
|
||||
default => sub { Devscripts::Salsa::Config->new->parse },
|
||||
);
|
||||
|
||||
=item B<cache> : Devscripts::JSONCache object
|
||||
|
||||
=cut
|
||||
|
||||
# File cache to avoid polling GitLab too much
|
||||
# (used to store ids, paths and names)
|
||||
has _cache => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
return {} unless ($_[0]->config->cache_file);
|
||||
my %h;
|
||||
eval {
|
||||
my ($cache_file, $cache_dir) = fileparse $_[0]->config->cache_file;
|
||||
if (!-d $cache_dir) {
|
||||
make_path $cache_dir;
|
||||
}
|
||||
require Devscripts::JSONCache;
|
||||
tie %h, 'Devscripts::JSONCache', $_[0]->config->cache_file;
|
||||
ds_debug "Cache opened";
|
||||
};
|
||||
if ($@) {
|
||||
ds_verbose "Unable to create cache object: $@";
|
||||
return {};
|
||||
}
|
||||
return \%h;
|
||||
},
|
||||
);
|
||||
has cache => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
$_[0]->_cache->{ $_[0]->config->api_url } //= {};
|
||||
return $_[0]->_cache->{ $_[0]->config->api_url };
|
||||
},
|
||||
);
|
||||
|
||||
# In-memory cache (used to avoid querying the project id twice when using
# update_safe)
|
||||
has projectCache => (
|
||||
is => 'rw',
|
||||
default => sub { {} },
|
||||
);
|
||||
|
||||
=item B<api>: GitLab::API::v4 object
|
||||
|
||||
=cut
|
||||
|
||||
has api => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
my $r = GitLab::API::v4->new(
|
||||
url => $_[0]->config->api_url,
|
||||
(
|
||||
$_[0]->config->private_token
|
||||
? (private_token => $_[0]->config->private_token)
|
||||
: ()
|
||||
),
|
||||
);
|
||||
$r or ds_die "Unable to create GitLab::API::v4 object";
|
||||
return $r;
|
||||
},
|
||||
);
|
||||
|
||||
=item User or group in use
|
||||
|
||||
=over
|
||||
|
||||
=item B<username>
|
||||
|
||||
=item B<user_id>
|
||||
|
||||
=item B<group_id>
|
||||
|
||||
=item B<group_path>
|
||||
|
||||
=back
|
||||
|
||||
=cut
|
||||
|
||||
# Accessors that resolve names, ids or paths
|
||||
has username => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->id2username });
|
||||
|
||||
has user_id => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
$_[0]->config->user_id || $_[0]->username2id;
|
||||
},
|
||||
);
|
||||
|
||||
has group_id => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->config->group_id || $_[0]->group2id },
|
||||
);
|
||||
|
||||
has group_path => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
my ($self) = @_;
|
||||
return undef unless ($self->group_id);
|
||||
return $self->cache->{group_path}->{ $self->{group_id} }
|
||||
if $self->cache->{group_path}->{ $self->{group_id} };
|
||||
return $self->{group_path} if ($self->{group_path}); # Set if --group
|
||||
eval {
|
||||
$self->{group_path}
|
||||
= $self->api->group_without_projects($self->group_id)
|
||||
->{full_path};
|
||||
$self->cache->{group_path}->{ $self->{group_id} }
|
||||
= $self->{group_path};
|
||||
};
|
||||
if ($@) {
|
||||
ds_verbose $@;
|
||||
ds_warn "Nonexistent group " . $self->group_id;
|
||||
return undef;
|
||||
}
|
||||
return $self->{group_path};
|
||||
},
|
||||
);
|
||||
|
||||
=back
|
||||
|
||||
=head1 METHODS
|
||||
|
||||
=over
|
||||
|
||||
=item B<run>: main method; loads and runs the command and returns a Unix result code.
|
||||
|
||||
=cut
|
||||
|
||||
sub run {
|
||||
my ($self, $args) = @_;
|
||||
binmode STDOUT, ':utf8';
|
||||
|
||||
# Check group or user id
|
||||
my $command = $self->config->command;
|
||||
if (my $tmp = cmd_aliases->{$command}) {
|
||||
$command = $tmp;
|
||||
}
|
||||
eval { with "Devscripts::Salsa::$command" };
|
||||
if ($@) {
|
||||
ds_verbose $@;
|
||||
ds_die "Unknown command $command";
|
||||
return 1;
|
||||
}
|
||||
return $self->$command(@ARGV);
|
||||
}
|
||||
|
||||
=back
|
||||
|
||||
=head2 Utilities
|
||||
|
||||
=over
|
||||
|
||||
=item B<levels_name>, B<levels_code>: convert access level names
(owner, maintainer, developer, reporter and guest) to GitLab level codes
and back
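
For instance (a sketch):

    my $code = $self->levels_name('developer');   # numeric GitLab access level
    my $name = $self->levels_code($code);         # back to 'developer'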
|
||||
|
||||
=cut
|
||||
|
||||
sub levels_name {
|
||||
my $res = {
|
||||
|
||||
# needs GitLab::API::v4::Constants 0.11
|
||||
# no_access => $GITLAB_ACCESS_LEVEL_NO_ACCESS,
|
||||
guest => $GITLAB_ACCESS_LEVEL_GUEST,
|
||||
reporter => $GITLAB_ACCESS_LEVEL_REPORTER,
|
||||
developer => $GITLAB_ACCESS_LEVEL_DEVELOPER,
|
||||
maintainer => $GITLAB_ACCESS_LEVEL_MASTER,
|
||||
owner => $GITLAB_ACCESS_LEVEL_OWNER,
|
||||
}->{ $_[1] };
|
||||
ds_die "Unknown access level '$_[1]'" unless ($res);
|
||||
return $res;
|
||||
}
|
||||
|
||||
sub levels_code {
|
||||
return {
|
||||
$GITLAB_ACCESS_LEVEL_GUEST => 'guest',
|
||||
$GITLAB_ACCESS_LEVEL_REPORTER => 'reporter',
|
||||
$GITLAB_ACCESS_LEVEL_DEVELOPER => 'developer',
|
||||
$GITLAB_ACCESS_LEVEL_MASTER => 'maintainer',
|
||||
$GITLAB_ACCESS_LEVEL_OWNER => 'owner',
|
||||
}->{ $_[1] };
|
||||
}
|
||||
|
||||
=item B<username2id>, B<id2username>: convert a username to an id and back
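
For instance (a sketch; the username is only illustrative):

    my $id   = $self->username2id('yadd');
    my $name = $self->id2username($id);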
|
||||
|
||||
=cut
|
||||
|
||||
sub username2id {
|
||||
my ($self, $user) = @_;
|
||||
$user ||= $self->config->user || $self->api->current_user->{id};
|
||||
unless ($user) {
|
||||
return ds_warn "Token seems invalid";
|
||||
return 1;
|
||||
}
|
||||
unless ($user =~ /^\d+$/) {
|
||||
return $self->cache->{user_id}->{$user}
|
||||
if $self->cache->{user_id}->{$user};
|
||||
my $users = $self->api->users({ username => $user });
|
||||
return ds_die "Username '$user' not found"
|
||||
unless ($users and @$users);
|
||||
ds_verbose "$user id is $users->[0]->{id}";
|
||||
$self->cache->{user_id}->{$user} = $users->[0]->{id};
|
||||
return $users->[0]->{id};
|
||||
}
|
||||
return $user;
|
||||
}
|
||||
|
||||
sub id2username {
|
||||
my ($self, $id) = @_;
|
||||
$id ||= $self->config->user_id || $self->api->current_user->{id};
|
||||
return $self->cache->{user}->{$id} if $self->cache->{user}->{$id};
|
||||
my $res = eval { $self->api->user($id)->{username} };
|
||||
if ($@) {
|
||||
ds_verbose $@;
|
||||
return ds_die "$id not found";
|
||||
}
|
||||
ds_verbose "$id is $res";
|
||||
$self->cache->{user}->{$id} = $res;
|
||||
return $res;
|
||||
}
|
||||
|
||||
=item B<group2id>: convert group name to id
|
||||
|
||||
=cut
|
||||
|
||||
sub group2id {
|
||||
my ($self, $name) = @_;
|
||||
$name ||= $self->config->group;
|
||||
return unless $name;
|
||||
if ($self->cache->{group_id}->{$name}) {
|
||||
$self->group_path($self->cache->{group_id}->{$name}->{path});
|
||||
return $self->group_id($self->cache->{group_id}->{$name}->{id});
|
||||
}
|
||||
my $groups = $self->api->group_without_projects($name);
|
||||
if ($groups) {
|
||||
$groups = [$groups];
|
||||
} else {
|
||||
$groups = $self->api->groups({ search => $name });
|
||||
}
|
||||
return ds_die "No group found" unless ($groups and @$groups);
|
||||
if (scalar @$groups > 1) {
|
||||
ds_warn "More than one group found:";
|
||||
foreach (@$groups) {
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Name : $_->{name}
|
||||
Full name: $_->{full_name}
|
||||
Full path: $_->{full_path}
|
||||
|
||||
END
|
||||
}
|
||||
return ds_die "Set the chosen group id using --group-id.";
|
||||
}
|
||||
ds_verbose "$name id is $groups->[0]->{id}";
|
||||
$self->cache->{group_id}->{$name}->{path}
|
||||
= $self->group_path($groups->[0]->{full_path});
|
||||
$self->cache->{group_id}->{$name}->{id} = $groups->[0]->{id};
|
||||
return $self->group_id($groups->[0]->{id});
|
||||
}
|
||||
|
||||
=item B<project2id>: get id of a project.
|
||||
|
||||
=cut
|
||||
|
||||
sub project2id {
|
||||
my ($self, $project) = @_;
|
||||
return $project if ($project =~ /^\d+$/);
|
||||
my $res;
|
||||
$project = $self->project2path($project);
|
||||
if ($self->projectCache->{$project}) {
|
||||
ds_debug "use cached id for $project";
|
||||
return $self->projectCache->{$project};
|
||||
}
|
||||
unless ($project =~ /^\d+$/) {
|
||||
eval { $res = $self->api->project($project)->{id}; };
|
||||
if ($@) {
|
||||
ds_debug $@;
|
||||
ds_warn "Project $project not found";
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
ds_verbose "$project id is $res";
|
||||
$self->projectCache->{$project} = $res;
|
||||
return $res;
|
||||
}
|
||||
|
||||
=item B<project2path>: get full path of a project
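
A name without a C</> is prefixed with the path from B<main_path>, for
instance (a sketch, assuming the C<debian> group is configured):

    $self->project2path('devscripts');          # "debian/devscripts"
    $self->project2path('debian/devscripts');   # returned unchanged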
|
||||
|
||||
=cut
|
||||
|
||||
sub project2path {
|
||||
my ($self, $project) = @_;
|
||||
return $project if ($project =~ m#/#);
|
||||
my $path = $self->main_path;
|
||||
return undef unless ($path);
|
||||
ds_verbose "Project $project => $path/$project";
|
||||
return "$path/$project";
|
||||
}
|
||||
|
||||
=item B<main_path>: build path using given group or user
|
||||
|
||||
=cut
|
||||
|
||||
sub main_path {
|
||||
my ($self) = @_;
|
||||
my $path;
|
||||
if ($self->config->path) {
|
||||
$path = $self->config->path;
|
||||
} elsif (my $tmp = $self->group_path) {
|
||||
$path = $tmp;
|
||||
} elsif ($self->user_id) {
|
||||
$path = $self->username;
|
||||
} else {
|
||||
ds_warn "Unable to determine project path";
|
||||
return undef;
|
||||
}
|
||||
return $path;
|
||||
}
|
||||
|
||||
# GitLab::API::v4 does not permit calling /groups/:id with parameters.
# Without them, fetching the "debian" group takes far too long, since GitLab
# returns the list of all projects together with all of their details
|
||||
sub GitLab::API::v4::group_without_projects {
|
||||
my $self = shift;
|
||||
return $self->_call_rest_client('GET', 'groups/:group_id', [@_],
|
||||
{ query => { with_custom_attributes => 0, with_projects => 0 } });
|
||||
}
|
||||
|
||||
1;
|
||||
|
||||
=back
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
Xavier Guimard E<lt>yadd@debian.orgE<gt>
|
||||
|
||||
=head1 COPYRIGHT AND LICENSE
|
||||
|
||||
Copyright 2018, Xavier Guimard E<lt>yadd@debian.orgE<gt>
|
524
lib/Devscripts/Salsa/Config.pm
Executable file
|
@@ -0,0 +1,524 @@
|
|||
# Salsa configuration (inherits from Devscripts::Config)
|
||||
package Devscripts::Salsa::Config;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo;
|
||||
|
||||
extends 'Devscripts::Config';
|
||||
|
||||
# Declare accessors for each option
|
||||
# Source : ./lib/Devscripts/Salsa/Config.pm:use constant keys
|
||||
# command & private_token
|
||||
# Skipping: info
|
||||
# Note : [Salsa = GitLab] jobs = builds, info = prompt, token = private_token
|
||||
foreach (qw(
|
||||
command private_token
|
||||
chdir cache_file no_cache path yes no_fail verbose debug
|
||||
user user_id group group_id token token_file
|
||||
all all_archived archived skip skip_file no_skip
|
||||
analytics auto_devops container environments feature_flags forks
|
||||
infrastructure issues jobs lfs monitor mr packages pages releases
|
||||
repo request_access requirements security_compliance service_desk snippets
|
||||
wiki
|
||||
avatar_path desc desc_pattern
|
||||
email disable_email email_recipient
|
||||
irc_channel
|
||||
irker disable_irker irker_host irker_port
|
||||
kgb disable_kgb kgb_options
|
||||
tagpending disable_tagpending
|
||||
rename_head source_branch dest_branch
|
||||
enable_remove_branch disable_remove_branch
|
||||
build_timeout ci_config_path
|
||||
schedule_desc schedule_ref schedule_cron schedule_tz schedule_enable
|
||||
schedule_disable schedule_run schedule_delete
|
||||
mr_allow_squash mr_desc mr_dst_branch mr_dst_project
|
||||
mr_remove_source_branch mr_src_branch mr_src_project mr_title
|
||||
api_url git_server_url irker_server_url kgb_server_url
|
||||
tagpending_server_url
|
||||
)
|
||||
) {
|
||||
has $_ => (is => 'rw');
|
||||
}
|
||||
|
||||
my $cacheDir;
|
||||
|
||||
our @kgbOpt = qw(
|
||||
push_events issues_events confidential_issues_events
|
||||
confidential_comments_events merge_requests_events tag_push_events
|
||||
note_events job_events pipeline_events wiki_page_events
|
||||
confidential_note_events enable_ssl_verification
|
||||
);
|
||||
|
||||
BEGIN {
|
||||
$cacheDir = $ENV{XDG_CACHE_HOME} || $ENV{HOME} . '/.cache';
|
||||
}
|
||||
|
||||
# Options
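# Each entry below appears to follow the Devscripts::Config convention of
# [ Getopt::Long spec, configuration variable, check (regexp, 'bool' or
# coderef), default ]; trailing fields may be omitted and either the spec or
# the variable may be undef (inferred from the entries, not a formal spec).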
|
||||
use constant keys => [
|
||||
# General salsa
|
||||
[
|
||||
'C|chdir=s', undef,
|
||||
sub { return (chdir($_[1]) ? 1 : (0, "$_[1] doesn't exist")) }
|
||||
],
|
||||
[
|
||||
'cache-file',
|
||||
'SALSA_CACHE_FILE',
|
||||
sub {
|
||||
$_[0]->cache_file($_[1] ? $_[1] : undef);
|
||||
},
|
||||
"$cacheDir/salsa.json"
|
||||
],
|
||||
[
|
||||
'no-cache',
|
||||
'SALSA_NO_CACHE',
|
||||
sub {
|
||||
$_[0]->cache_file(undef)
|
||||
if ($_[1] !~ /^(?:no|0+)$/i);
|
||||
return 1;
|
||||
}
|
||||
],
|
||||
[
|
||||
'path=s',
|
||||
'SALSA_REPO_PATH',
|
||||
sub {
|
||||
$_ = $_[1];
|
||||
s#/*(.*)/*#$1#;
|
||||
$_[0]->path($_);
|
||||
return /^[\w\d\-]+$/ ? 1 : (0, "Bad path $_");
|
||||
}
|
||||
],
|
||||
|
||||
# Responses
|
||||
['yes!', 'SALSA_YES', sub { info(1, "SALSA_YES", @_) }],
|
||||
['no-fail', 'SALSA_NO_FAIL', 'bool'],
|
||||
|
||||
# Output
|
||||
['verbose!', 'SALSA_VERBOSE', sub { $verbose = 1 }],
|
||||
['debug', undef, sub { $verbose = 2 }],
|
||||
['info|i', 'SALSA_INFO', sub { info(-1, 'SALSA_INFO', @_) }],
|
||||
|
||||
# General GitLab
|
||||
['user=s', 'SALSA_USER', qr/^[\-\w]+$/],
|
||||
['user-id=s', 'SALSA_USER_ID', qr/^\d+$/],
|
||||
['group=s', 'SALSA_GROUP', qr/^[\/\-\w]+$/],
|
||||
['group-id=s', 'SALSA_GROUP_ID', qr/^\d+$/],
|
||||
['token', 'SALSA_TOKEN', sub { $_[0]->private_token($_[1]) }],
|
||||
[
|
||||
'token-file',
|
||||
'SALSA_TOKEN_FILE',
|
||||
sub {
|
||||
my ($self, $v) = @_;
|
||||
return (0, "Unable to open token file") unless (-r $v);
|
||||
open F, $v;
|
||||
my $s = join '', <F>;
|
||||
close F;
|
||||
if ($s
|
||||
=~ m/^[^#]*(?:SALSA_(?:PRIVATE_)?TOKEN)\s*=\s*(["'])?([-\w]+)\1?$/m
|
||||
) {
|
||||
$self->private_token($2);
|
||||
return 1;
|
||||
} else {
|
||||
return (0, "No token found in file $v");
|
||||
}
|
||||
}
|
||||
],
|
||||
|
||||
# List/search
|
||||
['all'],
|
||||
['all-archived'],
|
||||
['archived!', 'SALSA_ARCHIVED', 'bool', 0],
|
||||
['skip=s', 'SALSA_SKIP', undef, sub { [] }],
|
||||
[
|
||||
'skip-file=s',
|
||||
'SALSA_SKIP_FILE',
|
||||
sub {
|
||||
return 1 unless $_[1];
|
||||
return (0, "Unable to read $_[1]") unless (-r $_[1]);
|
||||
open my $fh, $_[1];
|
||||
push @{ $_[0]->skip }, (map { chomp $_; ($_ ? $_ : ()) } <$fh>);
|
||||
return 1;
|
||||
}
|
||||
],
|
||||
['no-skip', undef, sub { $_[0]->skip([]); $_[0]->skip_file(undef); }],
|
||||
|
||||
# Features
|
||||
[
|
||||
'analytics=s', 'SALSA_ENABLE_ANALYTICS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'auto-devops=s',
|
||||
'SALSA_ENABLE_AUTO_DEVOPS',
|
||||
qr/y(es)?|true|enabled?|1|no?|false|disabled?|0/
|
||||
],
|
||||
[
|
||||
'container=s', 'SALSA_ENABLE_CONTAINER',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'environments=s',
|
||||
'SALSA_ENABLE_ENVIRONMENTS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'feature-flags=s',
|
||||
'SALSA_ENABLE_FEATURE_FLAGS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'forks=s', 'SALSA_ENABLE_FORKS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'infrastructure=s',
|
||||
'SALSA_ENABLE_INFRASTRUCTURE',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'issues=s', 'SALSA_ENABLE_ISSUES',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
# Renamed terminology, kept for legacy: jobs == builds_access_level (ENABLE_JOBS -> ENABLE_BUILD)
|
||||
[
|
||||
'jobs=s', 'SALSA_ENABLE_JOBS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'lfs=s', 'SALSA_ENABLE_LFS',
|
||||
qr/y(es)?|true|enabled?|1|no?|false|disabled?|0/
|
||||
],
|
||||
[
|
||||
'monitor=s', 'SALSA_ENABLE_MONITOR',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'mr=s', 'SALSA_ENABLE_MR',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'packages=s', 'SALSA_ENABLE_PACKAGES',
|
||||
qr/y(es)?|true|enabled?|1|no?|false|disabled?|0/
|
||||
],
|
||||
[
|
||||
'pages=s', 'SALSA_ENABLE_PAGES',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'releases=s', 'SALSA_ENABLE_RELEASES',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'repo=s', 'SALSA_ENABLE_REPO',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'request-access=s',
|
||||
'SALSA_REQUEST_ACCESS',
|
||||
qr/y(es)?|true|enabled?|1|no?|false|disabled?|0/
|
||||
],
|
||||
[
|
||||
'requirements=s',
|
||||
'SALSA_ENABLE_REQUIREMENTS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'security-compliance=s',
|
||||
'SALSA_ENABLE_SECURITY_COMPLIANCE',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'service-desk=s',
|
||||
'SALSA_ENABLE_SERVICE_DESK',
|
||||
qr/y(es)?|true|enabled?|1|no?|false|disabled?|0/
|
||||
],
|
||||
[
|
||||
'snippets=s', 'SALSA_ENABLE_SNIPPETS',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
[
|
||||
'wiki=s', 'SALSA_ENABLE_WIKI',
|
||||
qr/y(es)?|true|enabled?|private|no?|false|disabled?/
|
||||
],
|
||||
|
||||
# Branding
|
||||
['avatar-path=s', 'SALSA_AVATAR_PATH', undef],
|
||||
['desc!', 'SALSA_DESC', 'bool'],
|
||||
['desc-pattern=s', 'SALSA_DESC_PATTERN', qr/\w/, 'Debian package %p'],
|
||||
|
||||
# Notification
|
||||
[
|
||||
'email!', undef,
|
||||
sub { !$_[1] or $_[0]->enable('yes', 'email', 'disable_email'); }
|
||||
],
|
||||
[
|
||||
'disable-email!', undef,
|
||||
sub { !$_[1] or $_[0]->enable('no', 'email', 'disable_email'); }
|
||||
],
|
||||
[
|
||||
undef, 'SALSA_EMAIL',
|
||||
sub { $_[0]->enable($_[1], 'email', 'disable_email'); }
|
||||
],
|
||||
['email-recipient=s', 'SALSA_EMAIL_RECIPIENTS', undef, sub { [] }],
|
||||
['irc-channel|irc=s', 'SALSA_IRC_CHANNEL', undef, sub { [] }],
|
||||
[
|
||||
'irker!', undef,
|
||||
sub { !$_[1] or $_[0]->enable('yes', 'irker', 'disable_irker'); }
|
||||
],
|
||||
[
|
||||
'disable-irker!', undef,
|
||||
sub { !$_[1] or $_[0]->enable('no', 'irker', 'disable_irker'); }
|
||||
],
|
||||
[
|
||||
undef, 'SALSA_IRKER',
|
||||
sub { $_[0]->enable($_[1], 'irker', 'disable_irker'); }
|
||||
],
|
||||
['irker-host=s', 'SALSA_IRKER_HOST', undef, 'ruprecht.snow-crash.org'],
|
||||
['irker-port=s', 'SALSA_IRKER_PORT', qr/^\d*$/],
|
||||
[
|
||||
'kgb!', undef,
|
||||
sub { !$_[1] or $_[0]->enable('yes', 'kgb', 'disable_kgb'); }
|
||||
],
|
||||
[
|
||||
'disable-kgb!', undef,
|
||||
sub { !$_[1] or $_[0]->enable('no', 'kgb', 'disable_kgb'); }
|
||||
],
|
||||
[undef, 'SALSA_KGB', sub { $_[0]->enable($_[1], 'kgb', 'disable_kgb'); }],
|
||||
[
|
||||
'kgb-options=s',
|
||||
'SALSA_KGB_OPTIONS',
|
||||
qr/\w/,
|
||||
'push_events,issues_events,merge_requests_events,tag_push_events,'
|
||||
. 'note_events,pipeline_events,wiki_page_events,'
|
||||
. 'enable_ssl_verification'
|
||||
],
|
||||
[
|
||||
'tagpending!',
|
||||
undef,
|
||||
sub {
|
||||
!$_[1]
|
||||
or $_[0]->enable('yes', 'tagpending', 'disable_tagpending');
|
||||
}
|
||||
],
|
||||
[
|
||||
'disable-tagpending!',
|
||||
undef,
|
||||
sub {
|
||||
!$_[1] or $_[0]->enable('no', 'tagpending', 'disable_tagpending');
|
||||
}
|
||||
],
|
||||
[
|
||||
undef, 'SALSA_TAGPENDING',
|
||||
sub { $_[0]->enable($_[1], 'tagpending', 'disable_tagpending'); }
|
||||
],
|
||||
|
||||
# Branch
|
||||
['rename-head!', 'SALSA_RENAME_HEAD', 'bool'],
|
||||
['source-branch=s', 'SALSA_SOURCE_BRANCH', undef, 'master'],
|
||||
['dest-branch=s', 'SALSA_DEST_BRANCH', undef, 'debian/latest'],
|
||||
[
|
||||
'enable-remove-source-branch!',
|
||||
undef,
|
||||
sub {
|
||||
!$_[1]
|
||||
or $_[0]
|
||||
->enable('yes', 'enable_remove_branch', 'disable_remove_branch');
|
||||
}
|
||||
],
|
||||
[
|
||||
'disable-remove-source-branch!',
|
||||
undef,
|
||||
sub {
|
||||
!$_[1]
|
||||
or $_[0]
|
||||
->enable('no', 'enable_remove_branch', 'disable_remove_branch');
|
||||
}
|
||||
],
|
||||
[
|
||||
undef,
|
||||
'SALSA_REMOVE_SOURCE_BRANCH',
|
||||
sub {
|
||||
$_[0]
|
||||
->enable($_[1], 'enable_remove_branch', 'disable_remove_branch');
|
||||
}
|
||||
],
|
||||
|
||||
# Merge requests
|
||||
['mr-allow-squash!', 'SALSA_MR_ALLOW_SQUASH', 'bool', 1],
|
||||
['mr-desc=s'],
|
||||
['mr-dst-branch=s', undef, undef, 'master'],
|
||||
['mr-dst-project=s'],
|
||||
['mr-remove-source-branch!', 'SALSA_MR_REMOVE_SOURCE_BRANCH', 'bool', 0],
|
||||
['mr-src-branch=s'],
|
||||
['mr-src-project=s'],
|
||||
['mr-title=s'],
|
||||
|
||||
# CI
|
||||
['build-timeout=s', 'SALSA_BUILD_TIMEOUT', qr/^\d+$/, '3600'],
|
||||
['ci-config-path=s', 'SALSA_CI_CONFIG_PATH', qr/\./],
|
||||
|
||||
# Pipeline schedules
|
||||
['schedule-desc=s', 'SALSA_SCHEDULE_DESC', qr/\w/],
|
||||
['schedule-ref=s', 'SALSA_SCHEDULE_REF'],
|
||||
['schedule-cron=s', 'SALSA_SCHEDULE_CRON'],
|
||||
['schedule-tz=s', 'SALSA_SCHEDULE_TZ'],
|
||||
['schedule-enable!', 'SALSA_SCHEDULE_ENABLE', 'bool'],
|
||||
['schedule-disable!', 'SALSA_SCHEDULE_DISABLE', 'bool'],
|
||||
['schedule-run!', 'SALSA_SCHEDULE_RUN', 'bool'],
|
||||
['schedule-delete!', 'SALSA_SCHEDULE_DELETE', 'bool'],
|
||||
|
||||
# Manage other GitLab instances
|
||||
[
|
||||
'api-url=s', 'SALSA_API_URL',
|
||||
qr#^https?://#, 'https://salsa.debian.org/api/v4'
|
||||
],
|
||||
[
|
||||
'git-server-url=s', 'SALSA_GIT_SERVER_URL',
|
||||
qr/^\S+\@\S+/, 'git@salsa.debian.org:'
|
||||
],
|
||||
[
|
||||
'irker-server-url=s', 'SALSA_IRKER_SERVER_URL',
|
||||
qr'^ircs?://', 'ircs://irc.oftc.net:6697/'
|
||||
],
|
||||
[
|
||||
'kgb-server-url=s', 'SALSA_KGB_SERVER_URL',
|
||||
qr'^https?://', 'https://kgb.debian.net/webhook/?channel='
|
||||
],
|
||||
[
|
||||
'tagpending-server-url=s',
|
||||
'SALSA_TAGPENDING_SERVER_URL',
|
||||
qr'^https?://',
|
||||
'https://webhook.salsa.debian.org/tagpending/'
|
||||
],
|
||||
];
|
||||
|
||||
# Consistency rules
|
||||
use constant rules => [
|
||||
# Warn unless a token exists (some commands may still work without it)
|
||||
sub {
|
||||
return (1,
|
||||
"SALSA_TOKEN not set in configuration files. Some commands may fail"
|
||||
) unless ($_[0]->private_token);
|
||||
},
|
||||
# Get command
|
||||
sub {
|
||||
return (0, "No command given, aborting") unless (@ARGV);
|
||||
$_[0]->command(shift @ARGV);
|
||||
return (0, "Malformed command: " . $_[0]->command)
|
||||
unless ($_[0]->command =~ /^[a-z_]+$/);
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
if ( ($_[0]->group or $_[0]->group_id)
|
||||
and ($_[0]->user_id or $_[0]->user)) {
|
||||
ds_warn "Both --user-id and --group-id are set, ignore --group-id";
|
||||
$_[0]->group(undef);
|
||||
$_[0]->group_id(undef);
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
if ($_[0]->group and $_[0]->group_id) {
|
||||
ds_warn "Both --group-id and --group are set, ignore --group";
|
||||
$_[0]->group(undef);
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
if ($_[0]->user and $_[0]->user_id) {
|
||||
ds_warn "Both --user-id and --user are set, ignore --user";
|
||||
$_[0]->user(undef);
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
if ($_[0]->email and not @{ $_[0]->email_recipient }) {
|
||||
return (0, '--email-recipient needed with --email');
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
if (@{ $_[0]->irc_channel }) {
|
||||
foreach (@{ $_[0]->irc_channel }) {
|
||||
if (/^#/) {
|
||||
return (1,
|
||||
"# found in --irc-channel, assuming double hash is wanted"
|
||||
);
|
||||
}
|
||||
}
|
||||
if ($_[0]->irc_channel->[1] and $_[0]->kgb) {
|
||||
return (0, "Only one IRC channel is accepted with --kgb");
|
||||
}
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
$_[0]->kgb_options([sort split ',\s*', $_[0]->kgb_options]);
|
||||
my @err;
|
||||
foreach my $o (@{ $_[0]->kgb_options }) {
|
||||
unless (grep { $_ eq $o } @kgbOpt) {
|
||||
push @err, $o;
|
||||
}
|
||||
}
|
||||
return (0, "Unknown KGB options: " . join(', ', @err))
|
||||
if @err;
|
||||
return 1;
|
||||
},
|
||||
];
|
||||
|
||||
sub usage {
|
||||
# Source: ./scripts/salsa.pl:=head1 SYNOPSIS
|
||||
# ./lib/Devscripts/Salsa.pm:sub run -> $ ls ./lib/Devscripts/Salsa/*.pm
|
||||
print <<END;
|
||||
usage: salsa <command> <parameters> <options>
|
||||
|
||||
Most used commands for managing users and groups:
|
||||
- add_user : Add a user to a group
|
||||
- delete_user : Remove a user from a group
|
||||
- search_groups : Search for a group using given string
|
||||
- search_users : Search for a user using given string
|
||||
- update_user : Update a user's role in a group
|
||||
- whoami : Gives information on the token owner
|
||||
|
||||
Most used commands for managing repositories:
|
||||
- checkout        : Clone a project's repository into the current directory
|
||||
- fork : Fork a project
|
||||
- last_ci_status : Displays the last continuous integration result
|
||||
- mr : Creates a merge request
|
||||
- schedules : Lists current pipeline schedule items
|
||||
- push_repo : Push local git repository to upstream repository
|
||||
- search_projects: Search for a project using given string
|
||||
- update_projects: Update project(s) configuration
|
||||
- update_safe : Shows differences before running update_projects
|
||||
|
||||
See salsa(1) manpage for more.
|
||||
END
|
||||
}
|
||||
|
||||
sub info {
|
||||
my ($num, $key, undef, $nv) = @_;
|
||||
$nv = (
|
||||
$nv =~ /^yes|1$/ ? $num
|
||||
: $nv =~ /^no|0$/i ? 0
|
||||
: return (0, "Bad $key value"));
|
||||
$ds_yes = $nv;
|
||||
}
|
||||
|
||||
sub enable {
|
||||
my ($self, $v, $en, $dis) = @_;
|
||||
$v = lc($v);
|
||||
if ($v eq 'ignore') {
|
||||
$self->{$en} = $self->{$dis} = 0;
|
||||
} elsif ($v eq 'yes') {
|
||||
$self->{$en} = 1;
|
||||
$self->{$dis} = 0;
|
||||
} elsif ($v eq 'no') {
|
||||
$self->{$en} = 0;
|
||||
$self->{$dis} = 1;
|
||||
} else {
|
||||
return (0, "Bad value for SALSA_" . uc($en));
|
||||
}
|
||||
return 1;
|
||||
}
|
||||
|
||||
1;
|
314
lib/Devscripts/Salsa/Hooks.pm
Normal file
|
@@ -0,0 +1,314 @@
|
|||
# Common hooks library
|
||||
package Devscripts::Salsa::Hooks;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub add_hooks {
|
||||
my ($self, $repo_id, $repo) = @_;
|
||||
if ( $self->config->kgb
|
||||
or $self->config->disable_kgb
|
||||
or $self->config->tagpending
|
||||
or $self->config->disable_tagpending
|
||||
or $self->config->irker
|
||||
or $self->config->disable_irker
|
||||
or $self->config->email
|
||||
or $self->config->disable_email) {
|
||||
my $hooks = $self->enabled_hooks($repo_id);
|
||||
return 1 unless (defined $hooks);
|
||||
# KGB hook (IRC)
|
||||
if ($self->config->kgb or $self->config->disable_kgb) {
|
||||
unless ($self->config->irc_channel->[0]
|
||||
or $self->config->disable_kgb) {
|
||||
ds_warn "--kgb needs --irc-channel";
|
||||
return 1;
|
||||
}
|
||||
if ($self->config->irc_channel->[1]) {
|
||||
ds_warn "KGB accepts only one --irc-channel value,";
|
||||
}
|
||||
if ($hooks->{kgb}) {
|
||||
ds_warn "Deleting old kgb (was $hooks->{kgb}->{url})";
|
||||
$self->api->delete_project_hook($repo_id, $hooks->{kgb}->{id});
|
||||
}
|
||||
if ($self->config->irc_channel->[0]
|
||||
and not $self->config->disable_kgb) {
|
||||
# TODO: if useful, add parameters for these options
|
||||
eval {
|
||||
$self->api->create_project_hook(
|
||||
$repo_id,
|
||||
{
|
||||
url => $self->config->kgb_server_url
|
||||
. $self->config->irc_channel->[0],
|
||||
map { ($_ => 1) } @{ $self->config->kgb_options },
|
||||
});
|
||||
ds_verbose "KGB hook added to project $repo_id (channel: "
|
||||
. $self->config->irc_channel->[0] . ')';
|
||||
};
|
||||
if ($@) {
|
||||
ds_warn "Failed to add KGB hook: $@";
|
||||
if (!$self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
# Irker hook (IRC)
|
||||
if ($self->config->irker or $self->config->disable_irker) {
|
||||
unless ($self->config->irc_channel->[0]
|
||||
or $self->config->disable_irker) {
|
||||
ds_warn "--irker needs --irc-channel";
|
||||
return 1;
|
||||
}
|
||||
if ($hooks->{irker}) {
|
||||
no warnings;
|
||||
ds_warn
|
||||
"Deleting old irker (redirected to $hooks->{irker}->{recipients})";
|
||||
$self->api->delete_project_service($repo_id, 'irker');
|
||||
}
|
||||
if ($self->config->irc_channel->[0]
|
||||
and not $self->config->disable_irker) {
|
||||
# TODO: if useful, add parameters for these options
|
||||
my $ch = join(' ',
|
||||
map { '#' . $_ } @{ $self->config->irc_channel });
|
||||
$self->api->edit_project_service(
|
||||
$repo_id, 'irker',
|
||||
{
|
||||
active => 1,
|
||||
server_host => $self->config->irker_host,
|
||||
(
|
||||
$self->config->irker_port
|
||||
? (server_port => $self->config->irker_port)
|
||||
: ()
|
||||
),
|
||||
default_irc_uri => $self->config->irker_server_url,
|
||||
recipients => $ch,
|
||||
colorize_messages => 1,
|
||||
});
|
||||
ds_verbose
|
||||
"Irker hook added to project $repo_id (channel: $ch)";
|
||||
}
|
||||
}
|
||||
# email on push
|
||||
if ($self->config->email or $self->config->disable_email) {
|
||||
if ($hooks->{email}) {
|
||||
no warnings;
|
||||
ds_warn
|
||||
"Deleting old email-on-push (redirected to $hooks->{email}->{recipients})";
|
||||
$self->api->delete_project_service($repo_id, 'emails-on-push');
|
||||
}
|
||||
if (@{ $self->config->email_recipient }
|
||||
and not $self->config->disable_email) {
|
||||
# TODO: if useful, add parameters for these options
|
||||
$self->api->edit_project_service(
|
||||
$repo_id,
|
||||
'emails-on-push',
|
||||
{
|
||||
recipients => join(' ',
|
||||
map { my $a = $_; $a =~ s/%p/$repo/; $a }
|
||||
@{ $self->config->email_recipient }),
|
||||
});
|
||||
no warnings;
|
||||
ds_verbose
|
||||
"Email-on-push hook added to project $repo_id (recipients: "
|
||||
. join(' ', @{ $self->config->email_recipient }) . ')';
|
||||
}
|
||||
}
|
||||
# Tagpending hook
|
||||
if ($self->config->tagpending or $self->config->disable_tagpending) {
|
||||
if ($hooks->{tagpending}) {
|
||||
ds_warn
|
||||
"Deleting old tagpending (was $hooks->{tagpending}->{url})";
|
||||
$self->api->delete_project_hook($repo_id,
|
||||
$hooks->{tagpending}->{id});
|
||||
}
|
||||
my $repo_name = $self->api->project($repo_id)->{name};
|
||||
unless ($self->config->disable_tagpending) {
|
||||
eval {
|
||||
$self->api->create_project_hook(
|
||||
$repo_id,
|
||||
{
|
||||
url => $self->config->tagpending_server_url
|
||||
. $repo_name,
|
||||
push_events => 1,
|
||||
});
|
||||
ds_verbose "Tagpending hook added to project $repo_id";
|
||||
};
|
||||
if ($@) {
|
||||
ds_warn "Failed to add Tagpending hook: $@";
|
||||
if (!$self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
sub enabled_hooks {
|
||||
my ($self, $repo_id) = @_;
|
||||
my $hooks;
|
||||
my $res = {};
|
||||
if ( $self->config->kgb
|
||||
or $self->config->disable_kgb
|
||||
or $self->config->tagpending
|
||||
or $self->config->disable_tagpending) {
|
||||
$hooks = eval { $self->api->project_hooks($repo_id) };
|
||||
if ($@) {
|
||||
ds_warn "Unable to check hooks for project $repo_id";
|
||||
return undef;
|
||||
}
|
||||
foreach my $h (@{$hooks}) {
|
||||
$res->{kgb} = {
|
||||
id => $h->{id},
|
||||
url => $h->{url},
|
||||
options => [grep { $h->{$_} and $h->{$_} eq 1 } keys %$h],
|
||||
}
|
||||
if $h->{url} =~ /\Q$self->{config}->{kgb_server_url}\E/;
|
||||
$res->{tagpending} = {
|
||||
id => $h->{id},
|
||||
url => $h->{url},
|
||||
}
|
||||
if $h->{url} =~ /\Q$self->{config}->{tagpending_server_url}\E/;
|
||||
}
|
||||
}
|
||||
if ( ($self->config->email or $self->config->disable_email)
|
||||
and $_ = $self->api->project_service($repo_id, 'emails-on-push')
|
||||
and $_->{active}) {
|
||||
$res->{email} = $_->{properties};
|
||||
}
|
||||
if ( ($self->config->irker or $self->config->disable_irker)
|
||||
and $_ = $self->api->project_service($repo_id, 'irker')
|
||||
and $_->{active}) {
|
||||
$res->{irker} = $_->{properties};
|
||||
}
|
||||
return $res;
|
||||
}
|
||||
|
||||
sub _check_config {
|
||||
my ($config, $key_name, $config_name, $can_be_private, $res_ref) = @_;
|
||||
if (!$config) { return undef; }
|
||||
for ($config) {
|
||||
if ($can_be_private) {
|
||||
if ($_ eq "private") {
|
||||
push @$res_ref, $key_name => "private";
|
||||
} elsif ($_ =~ qr/y(es)?|true|enabled?/) {
|
||||
push @$res_ref, $key_name => "enabled";
|
||||
} elsif ($_ =~ qr/no?|false|disabled?/) {
|
||||
push @$res_ref, $key_name => "disabled";
|
||||
} else {
|
||||
print "error with SALSA_$config_name";
|
||||
}
|
||||
} else {
|
||||
if ($_ =~ qr/y(es)?|true|enabled?/) {
|
||||
push @$res_ref, $key_name => 1;
|
||||
} elsif ($_ =~ qr/no?|false|disabled?/) {
|
||||
push @$res_ref, $key_name => 0;
|
||||
} else {
|
||||
print "error with SALSA_$config_name";
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
sub desc {
|
||||
my ($self, $repo) = @_;
|
||||
my @res = ();
|
||||
if ($self->config->desc) {
|
||||
my $str = $self->config->desc_pattern;
|
||||
$str =~ s/%P/$repo/g;
|
||||
$repo =~ s#.*/##;
|
||||
$str =~ s/%p/$repo/g;
|
||||
push @res, description => $str;
|
||||
}
|
||||
if ($self->config->build_timeout) {
|
||||
push @res, build_timeout => $self->config->build_timeout;
|
||||
}
|
||||
if ($self->config->ci_config_path) {
|
||||
push @res, ci_config_path => $self->config->ci_config_path;
|
||||
}
|
||||
|
||||
# Parameters: config value, key name, config name, can be private, result array ref
|
||||
_check_config($self->config->analytics,
|
||||
"analytics_access_level", "ENABLE_ANALYTICS", 1, \@res);
|
||||
_check_config($self->config->auto_devops,
|
||||
"auto_devops_enabled", "ENABLE_AUTO_DEVOPS", 0, \@res);
|
||||
_check_config(
|
||||
$self->config->container,
|
||||
"container_registry_access_level",
|
||||
"ENABLE_CONTAINER", 1, \@res
|
||||
);
|
||||
_check_config($self->config->environments,
|
||||
"environments_access_level", "ENABLE_ENVIRONMENTS", 1, \@res);
|
||||
_check_config($self->config->feature_flags,
|
||||
"feature_flags_access_level", "ENABLE_FEATURE_FLAGS", 1, \@res);
|
||||
_check_config($self->config->forks, "forking_access_level",
|
||||
"ENABLE_FORKS", 1, \@res);
|
||||
_check_config($self->config->infrastructure,
|
||||
"infrastructure_access_level", "ENABLE_INFRASTRUCTURE", 1, \@res);
|
||||
_check_config($self->config->issues, "issues_access_level",
|
||||
"ENABLE_ISSUES", 1, \@res);
|
||||
# Renamed terminology, kept for legacy: jobs == builds_access_level (ENABLE_JOBS -> ENABLE_BUILD)
|
||||
_check_config($self->config->jobs, "builds_access_level", "ENABLE_JOBS",
|
||||
1, \@res);
|
||||
_check_config($self->config->lfs, "lfs_enabled", "ENABLE_LFS", 0, \@res);
|
||||
_check_config($self->config->mr, "merge_requests_access_level",
|
||||
"ENABLE_MR", 1, \@res);
|
||||
_check_config($self->config->monitor,
|
||||
"monitor_access_level", "ENABLE_MONITOR", 1, \@res);
|
||||
_check_config($self->config->packages,
|
||||
"packages_enabled", "ENABLE_PACKAGES", 0, \@res);
|
||||
_check_config($self->config->pages, "pages_access_level", "ENABLE_PAGES",
|
||||
1, \@res);
|
||||
_check_config($self->config->releases,
|
||||
"releases_access_level", "ENABLE_RELEASES", 1, \@res);
|
||||
_check_config(
|
||||
$self->config->disable_remove_branch,
|
||||
"remove_source_branch_after_merge",
|
||||
"REMOVE_SOURCE_BRANCH", 0, \@res
|
||||
);
|
||||
_check_config($self->config->repo, "repository_access_level",
|
||||
"ENABLE_REPO", 1, \@res);
|
||||
_check_config($self->config->request_access,
|
||||
"request_access_enabled", "REQUEST_ACCESS", 0, \@res);
|
||||
_check_config($self->config->requirements,
|
||||
"requirements_access_level", "ENABLE_REQUIREMENTS", 1, \@res);
|
||||
_check_config(
|
||||
$self->config->security_compliance,
|
||||
"security_and_compliance_access_level",
|
||||
"ENABLE_SECURITY_COMPLIANCE", 1, \@res
|
||||
);
|
||||
_check_config($self->config->service_desk,
|
||||
"service_desk_enabled", "ENABLE_SERVICE_DESK", 0, \@res);
|
||||
_check_config($self->config->snippets,
|
||||
"snippets_access_level", "ENABLE_SNIPPETS", 1, \@res);
|
||||
_check_config($self->config->wiki, "wiki_access_level", "ENABLE_WIKI", 1,
|
||||
\@res);
|
||||
|
||||
return @res;
|
||||
}
|
||||
|
||||
sub desc_multipart {
|
||||
my ($self, $repo) = @_;
|
||||
my @res = ();
|
||||
if ($self->config->avatar_path) {
|
||||
my $str = $self->config->avatar_path;
|
||||
$str =~ s/%p/$repo/g;
|
||||
unless (-r $str) {
|
||||
ds_warn "Unable to find: $str";
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
exit 1;
|
||||
}
|
||||
} else {
|
||||
# avatar_path (salsa) -> avatar (GitLab API)
|
||||
push @res, avatar => $str;
|
||||
}
|
||||
}
|
||||
return @res;
|
||||
}
|
||||
|
||||
1;
|
75
lib/Devscripts/Salsa/Repo.pm
Executable file
|
@@ -0,0 +1,75 @@
|
|||
# Common method to get projects
|
||||
package Devscripts::Salsa::Repo;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Hooks";
|
||||
|
||||
sub get_repo {
|
||||
my ($self, $prompt, @reponames) = @_;
|
||||
my @repos;
|
||||
if (($self->config->all or $self->config->all_archived)
|
||||
and @reponames == 0) {
|
||||
ds_debug "--all is set";
|
||||
my $options = {};
|
||||
$options->{order_by} = 'name';
|
||||
$options->{sort} = 'asc';
|
||||
$options->{archived} = 'false' if not $self->config->all_archived;
|
||||
$options->{with_shared}
|
||||
= 'false'; # do not operate on foreign projects shared with us
|
||||
my $projects;
|
||||
# This rule disallows trying to configure all "Debian" projects:
|
||||
# - Debian id is 2
|
||||
# - next is 1987
|
||||
if ($self->group_id) {
|
||||
$projects
|
||||
= $self->api->paginator('group_projects', $self->group_id,
|
||||
$options)->all;
|
||||
} elsif ($self->user_id) {
|
||||
$projects
|
||||
= $self->api->paginator('user_projects', $self->user_id,
|
||||
$options)->all;
|
||||
} else {
|
||||
ds_warn "Missing or invalid token";
|
||||
return 1;
|
||||
}
|
||||
unless ($projects) {
|
||||
ds_warn "No projects found";
|
||||
return 1;
|
||||
}
|
||||
@repos = map {
|
||||
$self->projectCache->{ $_->{path_with_namespace} } = $_->{id};
|
||||
[$_->{id}, $_->{path}]
|
||||
} @$projects;
|
||||
if (@{ $self->config->skip }) {
|
||||
@repos = map {
|
||||
my $res = 1;
|
||||
foreach my $k (@{ $self->config->skip }) {
|
||||
$res = 0 if ($_->[1] =~ m#(?:.*/)?\Q$k\E#);
|
||||
}
|
||||
$res ? $_ : ();
|
||||
} @repos;
|
||||
}
|
||||
if ($ds_yes > 0 or !$prompt) {
|
||||
ds_verbose "Found " . @repos . " projects";
|
||||
} else {
|
||||
unless (
|
||||
ds_prompt(
|
||||
"You're going to configure "
|
||||
. @repos
|
||||
. " projects. Continue (N/y) "
|
||||
) =~ accept
|
||||
) {
|
||||
ds_warn "Aborting";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
} else {
|
||||
@repos = map { [$self->project2id($_), $_] } @reponames;
|
||||
}
|
||||
return @repos;
|
||||
}
|
||||
|
||||
1;
|
40
lib/Devscripts/Salsa/add_user.pm
Normal file
|
@@ -0,0 +1,40 @@
|
|||
# Adds a user in a group with a role
|
||||
package Devscripts::Salsa::add_user;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub add_user {
|
||||
my ($self, $level, $user) = @_;
|
||||
unless ($level and $user) {
|
||||
ds_warn "Usage $0 --group-id 1234 add_user <level> <userid>";
|
||||
return 1;
|
||||
}
|
||||
unless ($self->group_id) {
|
||||
ds_warn "Unable to add user without --group or --group-id";
|
||||
return 1;
|
||||
}
|
||||
|
||||
my $id = $self->username2id($user) or return 1;
|
||||
my $al = $self->levels_name($level) or return 1;
|
||||
return 1
|
||||
if (
|
||||
$ds_yes < 0
|
||||
and ds_prompt(
|
||||
"You're going to accept $user as $level in group $self->{group_id}. Continue (Y/n) "
|
||||
) =~ refuse
|
||||
);
|
||||
$self->api->add_group_member(
|
||||
$self->group_id,
|
||||
{
|
||||
user_id => $id,
|
||||
access_level => $al,
|
||||
});
|
||||
ds_warn "User $user added to group "
|
||||
. $self->group_id
|
||||
. " with role $level";
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
224
lib/Devscripts/Salsa/check_repo.pm
Executable file
|
@@ -0,0 +1,224 @@
|
|||
# Parses repos to check whether their parameters are set correctly
|
||||
package Devscripts::Salsa::check_repo;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Digest::MD5 qw(md5_hex);
|
||||
use Digest::file qw(digest_file_hex);
|
||||
use LWP::UserAgent;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
sub check_repo {
|
||||
my $self = shift;
|
||||
my ($res) = $self->_check_repo(@_);
|
||||
return $res;
|
||||
}
|
||||
|
||||
sub _url_md5_hex {
|
||||
my $url = shift;
|
||||
my $ua = LWP::UserAgent->new;
|
||||
my $res = $ua->get($url, "User-Agent" => "Devscripts/2.22.3",);
|
||||
if (!$res->is_success) {
|
||||
return undef;
|
||||
}
|
||||
return Digest::MD5::md5_hex($res->content);
|
||||
}
|
||||
|
||||
sub _check_repo {
|
||||
my ($self, @reponames) = @_;
|
||||
my $res = 0;
|
||||
my @fail;
|
||||
unless (@reponames or $self->config->all or $self->config->all_archived) {
|
||||
ds_warn "Usage $0 check_repo <--all|--all-archived|names>";
|
||||
return 1;
|
||||
}
|
||||
if (@reponames and $self->config->all) {
|
||||
ds_warn "--all with a reponame makes no sense";
|
||||
return 1;
|
||||
}
|
||||
if (@reponames and $self->config->all_archived) {
|
||||
ds_warn "--all-archived with a reponame makes no sense";
|
||||
return 1;
|
||||
}
|
||||
# Get repo list from Devscripts::Salsa::Repo
|
||||
my @repos = $self->get_repo(0, @reponames);
|
||||
return @repos unless (ref $repos[0]);
|
||||
foreach my $repo (@repos) {
|
||||
my @err;
|
||||
my ($id, $name) = @$repo;
|
||||
my $project = eval { $self->api->project($id) };
|
||||
unless ($project) {
|
||||
ds_debug $@;
|
||||
ds_warn "Project $name not found";
|
||||
next;
|
||||
}
|
||||
ds_debug "Checking $name ($id)";
|
||||
# check description
|
||||
my %prms = $self->desc($name);
|
||||
my %prms_multipart = $self->desc_multipart($name);
|
||||
if ($self->config->desc) {
|
||||
$project->{description} //= '';
|
||||
push @err, "bad description: $project->{description}"
|
||||
if ($prms{description} ne $project->{description});
|
||||
}
|
||||
# check build timeout
|
||||
if ($self->config->desc) {
|
||||
$project->{build_timeout} //= '';
|
||||
push @err, "bad build_timeout: $project->{build_timeout}"
|
||||
if ($prms{build_timeout} ne $project->{build_timeout});
|
||||
}
|
||||
# check features (w/permission) & ci config
|
||||
foreach (qw(
|
||||
analytics_access_level
|
||||
auto_devops_enabled
|
||||
builds_access_level
|
||||
ci_config_path
|
||||
container_registry_access_level
|
||||
environments_access_level
|
||||
feature_flags_access_level
|
||||
forking_access_level
|
||||
infrastructure_access_level
|
||||
issues_access_level
|
||||
lfs_enabled
|
||||
merge_requests_access_level
|
||||
monitor_access_level
|
||||
packages_enabled
|
||||
pages_access_level
|
||||
releases_access_level
|
||||
remove_source_branch_after_merge
|
||||
repository_access_level
|
||||
request_access_enabled
|
||||
requirements_access_level
|
||||
security_and_compliance_access_level
|
||||
service_desk_enabled
|
||||
snippets_access_level
|
||||
wiki_access_level
|
||||
)
|
||||
) {
|
||||
my $helptext = '';
|
||||
$helptext = ' (enabled)'
|
||||
if (defined $prms{$_} and $prms{$_} eq 1);
|
||||
$helptext = ' (disabled)'
|
||||
if (defined $prms{$_} and $prms{$_} eq 0);
|
||||
push @err, "$_ should be $prms{$_}$helptext"
|
||||
if (defined $prms{$_}
|
||||
and (!defined($project->{$_}) or $project->{$_} ne $prms{$_}));
|
||||
}
|
||||
# only public projects are accepted
|
||||
push @err, "Project visibility: $project->{visibility}"
|
||||
unless ($project->{visibility} eq "public");
|
||||
# Default branch
|
||||
if ($self->config->rename_head) {
|
||||
push @err, "Default branch: $project->{default_branch}"
|
||||
if ($project->{default_branch} ne $self->config->dest_branch);
|
||||
}
|
||||
# Webhooks (from Devscripts::Salsa::Hooks)
|
||||
my $hooks = $self->enabled_hooks($id);
|
||||
unless (defined $hooks) {
|
||||
ds_warn "Unable to get $name hooks";
|
||||
next;
|
||||
}
|
||||
# check avatar's path
|
||||
if ($self->config->avatar_path) {
|
||||
my ($md5_file, $md5_url) = "";
|
||||
if ($prms_multipart{avatar}) {
|
||||
ds_verbose "Calculating local avatar checksum";
|
||||
$md5_file = digest_file_hex($prms_multipart{avatar}, "MD5")
|
||||
or die "$prms_multipart{avatar} failed md5: $!";
|
||||
if ( $project->{avatar_url}
|
||||
and $project->{visibility} eq "public") {
|
||||
ds_verbose "Calculating remote avatar checksum";
|
||||
$md5_url = _url_md5_hex($project->{avatar_url})
|
||||
or die "$project->{avatar_url} failed md5: $!";
|
||||
# Will always force avatar if it can't detect
|
||||
} elsif ($project->{avatar_url}) {
|
||||
ds_warn
|
||||
"$name has an avatar, but is set to $project->{visibility} project visibility thus unable to remotely check checksum";
|
||||
}
|
||||
push @err, "Will set the avatar to be: $prms_multipart{avatar}"
|
||||
if (not length $md5_url or $md5_file ne $md5_url);
|
||||
}
|
||||
}
|
||||
# KGB
|
||||
if ($self->config->kgb and not $hooks->{kgb}) {
|
||||
push @err, "kgb missing";
|
||||
} elsif ($self->config->disable_kgb and $hooks->{kgb}) {
|
||||
push @err, "kgb enabled";
|
||||
} elsif ($self->config->kgb) {
|
||||
push @err,
|
||||
"bad irc channel: "
|
||||
. substr($hooks->{kgb}->{url},
|
||||
length($self->config->kgb_server_url))
|
||||
if $hooks->{kgb}->{url} ne $self->config->kgb_server_url
|
||||
. $self->config->irc_channel->[0];
|
||||
my @wopts = @{ $self->config->kgb_options };
|
||||
my @gopts = sort @{ $hooks->{kgb}->{options} };
|
||||
my $i = 0;
|
||||
while (@gopts and @wopts) {
|
||||
my $a;
|
||||
$a = ($wopts[0] cmp $gopts[0]);
|
||||
if ($a == -1) {
|
||||
push @err, "Missing KGB option " . shift(@wopts);
|
||||
} elsif ($a == 1) {
|
||||
push @err, 'Unwanted KGB option ' . shift(@gopts);
|
||||
} else {
|
||||
shift @wopts;
|
||||
shift @gopts;
|
||||
}
|
||||
}
|
||||
push @err, map { "Missing KGB option $_" } @wopts;
|
||||
push @err, map { "Unwanted KGB option $_" } @gopts;
|
||||
}
|
||||
# Email-on-push
|
||||
if ($self->config->email
|
||||
and not($hooks->{email} and %{ $hooks->{email} })) {
|
||||
push @err, "email-on-push missing";
|
||||
} elsif (
|
||||
$self->config->email
|
||||
and $hooks->{email}->{recipients} ne join(
|
||||
' ',
|
||||
map {
|
||||
my $a = $_;
|
||||
my $b = $name;
|
||||
$b =~ s#.*/##;
|
||||
$a =~ s/%p/$b/;
|
||||
$a
|
||||
} @{ $self->config->email_recipient })
|
||||
) {
|
||||
push @err, "bad email recipients " . $hooks->{email}->{recipients};
|
||||
} elsif ($self->config->disable_email and $hooks->{email}) {
|
||||
push @err, "email-on-push enabled";
|
||||
}
|
||||
# Irker
|
||||
if ($self->config->irker and not $hooks->{irker}) {
|
||||
push @err, "irker missing";
|
||||
} elsif ($self->config->irker
|
||||
and $hooks->{irker}->{recipients} ne
|
||||
join(' ', map { "#$_" } @{ $self->config->irc_channel })) {
|
||||
push @err, "bad irc channel: " . $hooks->{irker}->{recipients};
|
||||
} elsif ($self->config->disable_irker and $hooks->{irker}) {
|
||||
push @err, "irker enabled";
|
||||
}
|
||||
# Tagpending
|
||||
if ($self->config->tagpending and not $hooks->{tagpending}) {
|
||||
push @err, "tagpending missing";
|
||||
} elsif ($self->config->disable_tagpending
|
||||
and $hooks->{tagpending}) {
|
||||
push @err, "tagpending enabled";
|
||||
}
|
||||
# report errors
|
||||
if (@err) {
|
||||
$res++;
|
||||
push @fail, $name;
|
||||
print "$name:\n";
|
||||
print "\t$_\n" foreach (@err);
|
||||
} else {
|
||||
ds_verbose "$name: OK";
|
||||
}
|
||||
}
|
||||
return ($res, \@fail);
|
||||
}
|
||||
|
||||
1;
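# Illustrative note (not part of this module): _check_repo() returns the
# number of misconfigured projects and the list of their names, which
# update_safe reuses, e.g.
#
#   my ($res, $fails) = $self->_check_repo(@ARGV);
#   print "$res project(s) need fixing: @$fails\n" if $res;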
|
81
lib/Devscripts/Salsa/checkout.pm
Normal file
|
@ -0,0 +1,81 @@
|
|||
# Clones or updates a project's repository using gbp
|
||||
# TODO: git-dpm ?
|
||||
package Devscripts::Salsa::checkout;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Devscripts::Utils;
|
||||
use Dpkg::IPC;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
sub checkout {
|
||||
my ($self, @repos) = @_;
|
||||
unless (@repos or $self->config->all or $self->config->all_archived) {
|
||||
ds_warn "Usage $0 checkout <--all|--all-archived|names>";
|
||||
return 1;
|
||||
}
|
||||
if (@repos and $self->config->all) {
|
||||
ds_warn "--all with a project name makes no sense";
|
||||
return 1;
|
||||
}
|
||||
if (@repos and $self->config->all_archived) {
|
||||
ds_warn "--all-archived with a project name makes no sense";
|
||||
return 1;
|
||||
}
|
||||
# If --all is asked, launch all projects
|
||||
@repos = map { $_->[1] } $self->get_repo(0, @repos) unless (@repos);
|
||||
my $cdir = `pwd`;
|
||||
chomp $cdir;
|
||||
my $res = 0;
|
||||
foreach (@repos) {
|
||||
my $path = $self->project2path($_);
|
||||
s#.*/##;
|
||||
s#^https://salsa.debian.org/##;
|
||||
s#\.git$##;
|
||||
if (-d $_) {
|
||||
chdir $_;
|
||||
ds_verbose "Updating existing checkout in $_";
|
||||
spawn(
|
||||
exec => ['gbp', 'pull', '--pristine-tar'],
|
||||
wait_child => 1,
|
||||
nocheck => 1,
|
||||
);
|
||||
if ($?) {
|
||||
$res++;
|
||||
if ($self->config->no_fail) {
|
||||
print STDERR "gbp pull fails in $_\n";
|
||||
} else {
|
||||
ds_warn "gbp pull failed in $_\n";
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
chdir $cdir;
|
||||
} else {
|
||||
spawn(
|
||||
exec => [
|
||||
'gbp', 'clone',
|
||||
'--all', $self->config->git_server_url . $path . ".git"
|
||||
],
|
||||
wait_child => 1,
|
||||
nocheck => 1,
|
||||
);
|
||||
if ($?) {
|
||||
$res++;
|
||||
if ($self->config->no_fail) {
|
||||
print STDERR "gbp clone fails in $_\n";
|
||||
} else {
|
||||
ds_warn "gbp clone failed for $_\n";
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
ds_warn "$_ ready in $_/";
|
||||
}
|
||||
}
|
||||
return $res;
|
||||
}
|
||||
|
||||
1;
|
47
lib/Devscripts/Salsa/create_repo.pm
Normal file
|
@ -0,0 +1,47 @@
|
|||
# Creates project using name or path
|
||||
package Devscripts::Salsa::create_repo; # create_project
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Hooks";
|
||||
|
||||
sub create_repo {
|
||||
my ($self, $reponame) = @_;
|
||||
unless ($reponame) {
|
||||
ds_warn "Project name is missing";
|
||||
return 1;
|
||||
}
|
||||
# Get parameters from Devscripts::Salsa::Repo
|
||||
my $opts = {
|
||||
name => $reponame,
|
||||
path => $reponame,
|
||||
visibility => 'public',
|
||||
$self->desc($reponame),
|
||||
};
|
||||
if ($self->group_id) {
|
||||
$opts->{namespace_id} = $self->group_id;
|
||||
}
|
||||
return 1
|
||||
if (
|
||||
$ds_yes < 0
|
||||
and ds_prompt(
|
||||
"You're going to create $reponame in "
|
||||
. ($self->group_id ? $self->group_path : 'your namespace')
|
||||
. ". Continue (Y/n) "
|
||||
) =~ refuse
|
||||
);
|
||||
my $repo = eval { $self->api->create_project($opts) };
|
||||
if ($@ or !$repo) {
|
||||
ds_warn "Project not created: $@";
|
||||
return 1;
|
||||
}
|
||||
ds_warn "Project $repo->{web_url} created";
|
||||
$reponame =~ s#^.*/##;
|
||||
$self->add_hooks($repo->{id}, $reponame);
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
26
lib/Devscripts/Salsa/del_repo.pm
Normal file
|
@ -0,0 +1,26 @@
|
|||
# Deletes a project
|
||||
package Devscripts::Salsa::del_repo; # delete_project
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use Moo::Role;
|
||||
|
||||
sub del_repo {
|
||||
my ($self, $reponame) = @_;
|
||||
unless ($reponame) {
|
||||
ds_warn "Project name or path is missing";
|
||||
return 1;
|
||||
}
|
||||
my $id = $self->project2id($reponame) or return 1;
|
||||
my $path = $self->project2path($reponame);
|
||||
return 1
|
||||
if ($ds_yes < 0
|
||||
and ds_prompt("You're going to delete $path. Continue (Y/n) ")
|
||||
=~ refuse);
|
||||
$self->api->delete_project($id);
|
||||
ds_warn "Project $path deleted";
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
32
lib/Devscripts/Salsa/del_user.pm
Normal file
|
@ -0,0 +1,32 @@
|
|||
# Removes a user from a group
|
||||
package Devscripts::Salsa::del_user; # delete_user
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub del_user {
|
||||
my ($self, $user) = @_;
|
||||
unless ($user) {
|
||||
ds_warn "Usage $0 delete_user <user>";
|
||||
return 1;
|
||||
}
|
||||
unless ($self->group_id) {
|
||||
ds_warn "Unable to remove user without --group-id";
|
||||
return 1;
|
||||
}
|
||||
|
||||
my $id = $self->username2id($user) or return 1;
|
||||
return 1
|
||||
if (
|
||||
$ds_yes < 0
|
||||
and ds_prompt(
|
||||
"You're going to remove $user from group $self->{group_id}. Continue (Y/n) "
|
||||
) =~ refuse
|
||||
);
|
||||
$self->api->remove_group_member($self->group_id, $id);
|
||||
ds_warn "User $user removed from group " . $self->group_id;
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
36
lib/Devscripts/Salsa/fork.pm
Normal file
|
@ -0,0 +1,36 @@
|
|||
# Forks a project given by full path into group/user namespace
|
||||
package Devscripts::Salsa::fork;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use Moo::Role;
|
||||
|
||||
with 'Devscripts::Salsa::checkout';
|
||||
|
||||
sub fork {
|
||||
my ($self, $project) = @_;
|
||||
unless ($project) {
|
||||
ds_warn "Project to fork is missing";
|
||||
return 1;
|
||||
}
|
||||
my $path = $self->main_path or return 1;
|
||||
$self->api->fork_project($project, { namespace => $path });
|
||||
my $p = $project;
|
||||
$p =~ s#.*/##;
|
||||
if ($self->checkout($p)) {
|
||||
ds_warn "Failed to checkout $project";
|
||||
return 1;
|
||||
}
|
||||
chdir $p;
|
||||
spawn(
|
||||
exec => [
|
||||
qw(git remote add upstream),
|
||||
$self->config->git_server_url . $project
|
||||
],
|
||||
wait_child => 1
|
||||
);
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
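# Illustrative invocation (project path is hypothetical):
#
#   salsa fork debian/devscripts
#
# forks the project into the caller's namespace, clones the fork through
# checkout(), then adds the original project as the "upstream" git remote.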
|
45
lib/Devscripts/Salsa/forks.pm
Normal file
|
@ -0,0 +1,45 @@
|
|||
# Lists forks of a project
|
||||
package Devscripts::Salsa::forks;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub forks {
|
||||
my ($self, @reponames) = @_;
|
||||
my $res = 0;
|
||||
unless (@reponames) {
|
||||
ds_warn "Project name is missing";
|
||||
return 1;
|
||||
}
|
||||
foreach my $p (@reponames) {
|
||||
my $id = $self->project2id($p);
|
||||
unless ($id) {
|
||||
ds_warn "Project $_ not found";
|
||||
$res++;
|
||||
next;
|
||||
}
|
||||
print "$p\n";
|
||||
my $forks = $self->api->paginator(
|
||||
'project_forks',
|
||||
$id,
|
||||
{
|
||||
state => 'opened',
|
||||
});
|
||||
unless ($forks) {
|
||||
print "\n";
|
||||
next;
|
||||
}
|
||||
while ($_ = $forks->next) {
|
||||
print <<END;
|
||||
\tId : $_->{id}
|
||||
\tName: $_->{path_with_namespace}
|
||||
\tURL : $_->{web_url}
|
||||
|
||||
END
|
||||
}
|
||||
}
|
||||
return $res;
|
||||
}
|
||||
|
||||
1;
|
35
lib/Devscripts/Salsa/group.pm
Normal file
|
@ -0,0 +1,35 @@
|
|||
# Lists members of a group
|
||||
package Devscripts::Salsa::group; # list_users
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub group {
|
||||
my ($self) = @_;
|
||||
my $count = 0;
|
||||
unless ($self->group_id) {
|
||||
ds_warn "Usage $0 --group-id 1234 list_users";
|
||||
return 1;
|
||||
}
|
||||
my $users = $self->api->paginator('group_members', $self->group_id);
|
||||
while ($_ = $users->next) {
|
||||
$count++;
|
||||
my $access_level = $self->levels_code($_->{access_level});
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Username : $_->{username}
|
||||
Name : $_->{name}
|
||||
Access level: $access_level
|
||||
State : $_->{state}
|
||||
|
||||
END
|
||||
}
|
||||
unless ($count) {
|
||||
ds_warn "No users found";
|
||||
return 1;
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
20
lib/Devscripts/Salsa/join.pm
Normal file
|
@ -0,0 +1,20 @@
|
|||
# Launch request to join a group
|
||||
package Devscripts::Salsa::join;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub join {
|
||||
my ($self, $group) = @_;
|
||||
unless ($group ||= $self->config->group || $self->config->group_id) {
|
||||
ds_warn "Group is missing";
|
||||
return 1;
|
||||
}
|
||||
my $gid = $self->group2id($group);
|
||||
$self->api->group_access_requests($gid);
|
||||
ds_warn "Request launched to group $group ($gid)";
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
77
lib/Devscripts/Salsa/last_ci_status.pm
Normal file
|
@ -0,0 +1,77 @@
|
|||
package Devscripts::Salsa::last_ci_status;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
use constant OK => 'success';
|
||||
use constant SKIPPED => 'skipped';
|
||||
use constant FAILED => 'failed';
|
||||
|
||||
sub last_ci_status {
|
||||
my ($self, @repos) = @_;
|
||||
unless (@repos or $self->config->all or $self->config->all_archived) {
|
||||
ds_warn "Usage $0 ci_status <--all|--all-archived|names>";
|
||||
return 1;
|
||||
}
|
||||
if (@repos and $self->config->all) {
|
||||
ds_warn "--all with a project name makes no sense";
|
||||
return 1;
|
||||
}
|
||||
if (@repos and $self->config->all_archived) {
|
||||
ds_warn "--all-archived with a project name makes no sense";
|
||||
return 1;
|
||||
}
|
||||
# If --all is asked, launch all projects
|
||||
@repos = map { $_->[1] } $self->get_repo(0, @repos) unless (@repos);
|
||||
my $ret = 0;
|
||||
foreach my $repo (@repos) {
|
||||
my $id = $self->project2id($repo) or return 1;
|
||||
my $pipelines = $self->api->pipelines($id);
|
||||
unless ($pipelines and @$pipelines) {
|
||||
ds_warn "No pipelines for $repo";
|
||||
$ret++;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
} else {
|
||||
my $status = $pipelines->[0]->{status};
|
||||
if ($status eq OK) {
|
||||
print "Last result for $repo: $status\n";
|
||||
} else {
|
||||
print STDERR "Last result for $repo: $status\n";
|
||||
my $jobs
|
||||
= $self->api->pipeline_jobs($id, $pipelines->[0]->{id});
|
||||
my %jres;
|
||||
foreach my $job (sort { $a->{id} <=> $b->{id} } @$jobs) {
|
||||
next if $job->{status} eq SKIPPED;
|
||||
push @{ $jres{ $job->{status} } }, $job->{name};
|
||||
}
|
||||
if ($jres{ OK() }) {
|
||||
print STDERR ' success: '
|
||||
. join(', ', @{ $jres{ OK() } }) . "\n";
|
||||
delete $jres{ OK() };
|
||||
}
|
||||
foreach my $k (sort keys %jres) {
|
||||
print STDERR ' '
|
||||
. uc($k) . ': '
|
||||
. join(', ', @{ $jres{$k} }) . "\n";
|
||||
}
|
||||
print STDERR "\n See: " . $pipelines->[0]->{web_url} . "\n\n";
|
||||
if ($status eq FAILED) {
|
||||
$ret++;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return $ret;
|
||||
}
|
||||
|
||||
1;
|
40
lib/Devscripts/Salsa/list_groups.pm
Normal file
|
@ -0,0 +1,40 @@
|
|||
# Lists subgroups of a group or groups of a user
|
||||
package Devscripts::Salsa::list_groups;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub list_groups {
|
||||
my ($self, $match) = @_;
|
||||
my $groups;
|
||||
my $count = 0;
|
||||
my $opts = {
|
||||
order_by => 'name',
|
||||
sort => 'asc',
|
||||
($match ? (search => $match) : ()),
|
||||
};
|
||||
if ($self->group_id) {
|
||||
$groups
|
||||
= $self->api->paginator('group_subgroups', $self->group_id, $opts);
|
||||
} else {
|
||||
$groups = $self->api->paginator('groups', $opts);
|
||||
}
|
||||
while ($_ = $groups->next) {
|
||||
$count++;
|
||||
my $parent = $_->{parent_id} ? "Parent id: $_->{parent_id}\n" : '';
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Name : $_->{name}
|
||||
Full path: $_->{full_path}
|
||||
$parent
|
||||
END
|
||||
}
|
||||
unless ($count) {
|
||||
ds_warn "No groups found";
|
||||
return 1;
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
42
lib/Devscripts/Salsa/list_repos.pm
Normal file
|
@ -0,0 +1,42 @@
|
|||
# Lists projects of group/user
|
||||
package Devscripts::Salsa::list_repos; # list_projects
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub list_repos {
|
||||
my ($self, $match) = @_;
|
||||
my $projects;
|
||||
my $count = 0;
|
||||
my $opts = {
|
||||
order_by => 'name',
|
||||
sort => 'asc',
|
||||
simple => 1,
|
||||
archived => $self->config->archived,
|
||||
($match ? (search => $match) : ()),
|
||||
};
|
||||
if ($self->group_id) {
|
||||
$projects
|
||||
= $self->api->paginator('group_projects', $self->group_id, $opts);
|
||||
} else {
|
||||
$projects
|
||||
= $self->api->paginator('user_projects', $self->user_id, $opts);
|
||||
}
|
||||
while ($_ = $projects->next) {
|
||||
$count++;
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Name: $_->{name}
|
||||
URL : $_->{web_url}
|
||||
|
||||
END
|
||||
}
|
||||
unless ($count) {
|
||||
ds_warn "No projects found";
|
||||
return 1;
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
174
lib/Devscripts/Salsa/merge_request.pm
Normal file
|
@ -0,0 +1,174 @@
|
|||
# Creates a merge request from current directory (or using parameters)
|
||||
package Devscripts::Salsa::merge_request;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use Moo::Role;
|
||||
|
||||
with 'Devscripts::Salsa::search_project'; # search_projects
|
||||
|
||||
sub merge_request {
|
||||
my ($self, $dst_project, $dst_branch) = @_;
|
||||
my $src_branch = $self->config->mr_src_branch;
|
||||
my $src_project = $self->config->mr_src_project;
|
||||
$dst_project ||= $self->config->mr_dst_project;
|
||||
$dst_branch ||= $self->config->mr_dst_branch;
|
||||
my $title = $self->config->mr_title;
|
||||
my $desc = $self->config->mr_desc;
|
||||
|
||||
if ($src_branch) {
|
||||
unless ($src_project and $dst_project) {
|
||||
ds_warn "--mr-src-project and --mr-src-project "
|
||||
. "are required when --mr-src-branch is set";
|
||||
return 1;
|
||||
}
|
||||
unless ($src_project =~ m#/#) {
|
||||
$src_project = $self->project2path($src_project);
|
||||
}
|
||||
} else { # Use current repository to find elements
|
||||
ds_verbose "using current branch as source";
|
||||
my $out;
|
||||
unless ($src_project) {
|
||||
# 1. Verify that project is ready
|
||||
spawn(
|
||||
exec => [qw(git status -s -b -uno)],
|
||||
wait_child => 1,
|
||||
to_string => \$out
|
||||
);
|
||||
chomp $out;
|
||||
# Case "rebased"
|
||||
if ($out =~ /\[/) {
|
||||
ds_warn "Current branch isn't pushed, aborting:\n";
|
||||
return 1;
|
||||
}
|
||||
# Case else: nothing after src...dst
|
||||
unless ($out =~ /\s(\S+)\.\.\.(\S+)/s) {
|
||||
ds_warn
|
||||
"Current branch has no origin or isn't pushed, aborting";
|
||||
return 1;
|
||||
}
|
||||
# 2. Set source branch to current branch
|
||||
$src_branch ||= $1;
|
||||
ds_verbose "Found current branch: $src_branch";
|
||||
}
|
||||
unless ($src_project and $dst_project) {
|
||||
# Check remote links
|
||||
spawn(
|
||||
exec => [qw(git remote --verbose show)],
|
||||
wait_child => 1,
|
||||
to_string => \$out,
|
||||
);
|
||||
my $origin = $self->config->api_url;
|
||||
$origin =~ s#api/v4$##;
|
||||
# 3. Set source project using "origin" target
|
||||
unless ($src_project) {
|
||||
if ($out
|
||||
=~ /origin\s+(?:\Q$self->{config}->{git_server_url}\E|\Q$origin\E)(\S*)/m
|
||||
) {
|
||||
$src_project = $1;
|
||||
$src_project =~ s/\.git$//;
|
||||
} else {
|
||||
ds_warn
|
||||
"Unable to find project origin, set it using --mr-src-project";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
# 4. Steps to find destination project:
|
||||
# - command-line
|
||||
# - GitLab API (search for "forked_from_project"
|
||||
# - "upstream" in git remote
|
||||
# - use source project as destination project
|
||||
|
||||
# 4.1. Stop if dest project has been given in command line
|
||||
unless ($dst_project) {
|
||||
my $project = $self->api->project($src_project);
|
||||
|
||||
# 4.2. Search original project from GitLab API
|
||||
if ($project->{forked_from_project}) {
|
||||
$dst_project
|
||||
= $project->{forked_from_project}->{path_with_namespace};
|
||||
}
|
||||
if ($dst_project) {
|
||||
ds_verbose "Project was forked from $dst_project";
|
||||
|
||||
# 4.3. Search for an "upstream" target in `git remote`
|
||||
} elsif ($out
|
||||
=~ /upstream\s+(?:\Q$self->{config}->{git_server_url}\E|\Q$origin\E)(\S*)/m
|
||||
) {
|
||||
$dst_project = $1;
|
||||
$dst_project =~ s/\.git$//;
|
||||
ds_verbose 'Use "upstream" target as dst project';
|
||||
# 4.4. Use source project as destination
|
||||
} else {
|
||||
ds_warn
|
||||
"No upstream target found, using current project as target";
|
||||
$dst_project = $src_project;
|
||||
}
|
||||
ds_verbose "Use $dst_project as dest project";
|
||||
}
|
||||
}
|
||||
# 5. Search for MR title and desc
|
||||
unless ($title) {
|
||||
ds_warn "Title not set, using last commit";
|
||||
spawn(
|
||||
exec => ['git', 'show', '--format=format:%s###%b'],
|
||||
wait_child => 1,
|
||||
to_string => \$out,
|
||||
);
|
||||
$out =~ s/\ndiff.*$//s;
|
||||
my ($t, $d) = split /###/, $out;
|
||||
chomp $d;
|
||||
$title = $t;
|
||||
ds_verbose "Title set to $title";
|
||||
$desc ||= $d;
|
||||
# Replace all bug links by markdown links
|
||||
if ($desc) {
|
||||
$desc =~ s@#(\d{6,})\b@[#$1](https://bugs.debian.org/$1)@mg;
|
||||
ds_verbose "Desc set to $desc";
|
||||
}
|
||||
}
|
||||
}
|
||||
if ($dst_project eq 'same') {
|
||||
$dst_project = $src_project;
|
||||
}
|
||||
my $src = $self->api->project($src_project);
|
||||
unless ($title) {
|
||||
ds_warn "Title is required";
|
||||
return 1;
|
||||
}
|
||||
unless ($src and $src->{id}) {
|
||||
ds_warn "Target project not found $src_project";
|
||||
return 1;
|
||||
}
|
||||
my $dst;
|
||||
if ($dst_project) {
|
||||
$dst = $self->api->project($dst_project);
|
||||
unless ($dst and $dst->{id}) {
|
||||
ds_warn "Target project not found";
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
return 1
|
||||
if (
|
||||
ds_prompt(
|
||||
"You're going to push an MR to $dst_project:$dst_branch. Continue (Y/n)"
|
||||
) =~ refuse
|
||||
);
|
||||
my $res = $self->api->create_merge_request(
|
||||
$src->{id},
|
||||
{
|
||||
source_branch => $src_branch,
|
||||
target_branch => $dst_branch,
|
||||
title => $title,
|
||||
remove_source_branch => $self->config->mr_remove_source_branch,
|
||||
squash => $self->config->mr_allow_squash,
|
||||
($dst ? (target_project_id => $dst->{id}) : ()),
|
||||
($desc ? (description => $desc) : ()),
|
||||
});
|
||||
ds_warn "MR '$title' posted:";
|
||||
ds_warn $res->{web_url};
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
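# Illustrative invocations (values are hypothetical):
#
#   salsa merge_request                      # derive source/target from the
#                                            # current, already-pushed branch
#   salsa merge_request debian/foo master    # explicit target project/branch
#
# When --mr-title is unset, the last commit subject becomes the MR title and
# the commit body (with #NNNNNN rewritten as bugs.debian.org links) its
# description.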
|
49
lib/Devscripts/Salsa/merge_requests.pm
Normal file
|
@ -0,0 +1,49 @@
|
|||
# Lists merge requests proposed to a project
|
||||
package Devscripts::Salsa::merge_requests;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub merge_requests {
|
||||
my ($self, @reponames) = @_;
|
||||
my $res = 1;
|
||||
unless (@reponames) {
|
||||
ds_warn "project name is missing";
|
||||
return 1;
|
||||
}
|
||||
foreach my $p (@reponames) {
|
||||
my $id = $self->project2id($p);
|
||||
my $count = 0;
|
||||
unless ($id) {
|
||||
ds_warn "Project $_ not found";
|
||||
return 1;
|
||||
}
|
||||
print "$p\n";
|
||||
my $mrs = $self->api->paginator(
|
||||
'merge_requests',
|
||||
$id,
|
||||
{
|
||||
state => 'opened',
|
||||
});
|
||||
while ($_ = $mrs->next) {
|
||||
$res = 0;
$count++;
|
||||
my $status = $_->{work_in_progress} ? 'WIP' : $_->{merge_status};
|
||||
print <<END;
|
||||
\tId : $_->{id}
|
||||
\tTitle : $_->{title}
|
||||
\tAuthor: $_->{author}->{username}
|
||||
\tStatus: $status
|
||||
\tUrl : $_->{web_url}
|
||||
|
||||
END
|
||||
}
|
||||
unless ($count) {
|
||||
print "\n";
|
||||
next;
|
||||
}
|
||||
}
|
||||
return $res;
|
||||
}
|
||||
|
||||
1;
|
127
lib/Devscripts/Salsa/pipeline_schedule.pm
Executable file
|
@ -0,0 +1,127 @@
|
|||
# Create a pipeline schedule using parameters
|
||||
package Devscripts::Salsa::pipeline_schedule;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
# For --all
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
sub pipeline_schedule {
|
||||
my ($self, @repos) = @_;
|
||||
my $ret = 0;
|
||||
my $desc = $self->config->schedule_desc;
|
||||
my $ref = $self->config->schedule_ref;
|
||||
my $cron = $self->config->schedule_cron;
|
||||
my $tz = $self->config->schedule_tz;
|
||||
my $active = $self->config->schedule_enable;
|
||||
$active
|
||||
= ($self->config->schedule_disable)
|
||||
? "0"
|
||||
: $active;
|
||||
my $run = $self->config->schedule_run;
|
||||
my $delete = $self->config->schedule_delete;
|
||||
|
||||
unless (@repos or $self->config->all) {
|
||||
ds_warn "Usage $0 pipeline <project|--all>";
|
||||
return 1;
|
||||
}
|
||||
if (@repos and $self->config->all) {
|
||||
ds_warn "--all with a project (@repos) makes no sense";
|
||||
return 1;
|
||||
}
|
||||
|
||||
unless ($desc) {
|
||||
ds_warn "--schedule-desc / SALSA_SCHEDULE_DESC is missing";
|
||||
ds_warn "Are you looking for: $0 pipelines <project|--all>";
|
||||
return 1;
|
||||
}
|
||||
|
||||
# If --all is asked, launch all projects
|
||||
@repos = map { $_->[1] } $self->get_repo(0, @repos) unless (@repos);
|
||||
|
||||
foreach my $repo (sort @repos) {
|
||||
my $id = $self->project2id($repo);
|
||||
unless ($id) {
|
||||
#ds_warn "Project $repo not found"; # $self->project2id($repo) shows this error
|
||||
$ret++;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
} else {
|
||||
my @pipe_id = ();
|
||||
$desc =~ s/%p/$repo/g;
|
||||
my $options = {};
|
||||
$options->{ref} = $ref if defined $ref;
|
||||
$options->{cron} = $cron if defined $cron;
|
||||
$options->{cron_timezone} = $tz if defined $tz;
|
||||
$options->{active} = $active if defined $active;
|
||||
|
||||
# REF: https://docs.gitlab.com/ee/api/pipeline_schedules.html#get-all-pipeline-schedules
|
||||
# $self->api->pipeline_schedules($id)
|
||||
my $pipelines
|
||||
= $self->api->paginator('pipeline_schedules', $id)->all();
|
||||
ds_verbose "No pipelines scheduled for $repo" unless @$pipelines;
|
||||
|
||||
foreach (@$pipelines) {
|
||||
push @pipe_id, $_->{id}
|
||||
if ($_->{description} eq $desc);
|
||||
}
|
||||
|
||||
ds_warn "More than 1 scheduled pipeline matches: $desc ("
|
||||
. ++$#pipe_id . ")"
|
||||
if ($pipe_id[1]);
|
||||
|
||||
if (!@pipe_id) {
|
||||
ds_warn "--schedule-ref / SALSA_SCHEDULE_REF is required"
|
||||
unless ($ref);
|
||||
ds_warn "--schedule-cron / SALSA_SCHEDULE_CRON is required"
|
||||
unless ($cron);
|
||||
return 1
|
||||
unless ($ref && $cron);
|
||||
|
||||
$options->{description} = $desc if defined $desc;
|
||||
|
||||
ds_verbose "No scheduled pipelines matching: $desc. Creating!";
|
||||
my $schedule
|
||||
= $self->api->create_pipeline_schedule($id, $options);
|
||||
|
||||
@pipe_id = $schedule->{id};
|
||||
} elsif (keys %$options) {
|
||||
ds_verbose "Editing scheduled pipelines matching: $desc";
|
||||
foreach (@pipe_id) {
|
||||
next if !$_;
|
||||
|
||||
my $schedule
|
||||
= $self->api->edit_pipeline_schedule($id, $_, $options);
|
||||
}
|
||||
}
|
||||
|
||||
if ($run) {
|
||||
ds_verbose "Running scheduled pipelines matching: $desc";
|
||||
|
||||
foreach (@pipe_id) {
|
||||
next if !$_;
|
||||
|
||||
my $schedule = $self->api->run_pipeline_schedule($id, $_);
|
||||
}
|
||||
}
|
||||
|
||||
if ($delete) {
|
||||
ds_verbose "Deleting scheduled pipelines matching: $desc";
|
||||
|
||||
foreach (@pipe_id) {
|
||||
next if !$_;
|
||||
|
||||
my $schedule
|
||||
= $self->api->delete_pipeline_schedule($id, $_);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return $ret;
|
||||
}
|
||||
|
||||
1;
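# Illustrative invocation (values are hypothetical; option names follow the
# --schedule-* flags referenced above):
#
#   salsa pipeline_schedule myteam/mypkg \
#       --schedule-desc 'weekly %p build' \
#       --schedule-ref master --schedule-cron '0 4 * * 0'
#
# A schedule whose description already matches is edited in place; otherwise
# --schedule-ref and --schedule-cron are required to create a new one.
# %p in the description is replaced by the project name.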
|
73
lib/Devscripts/Salsa/pipeline_schedules.pm
Executable file
|
@ -0,0 +1,73 @@
|
|||
# Lists pipeline schedules of a project
|
||||
package Devscripts::Salsa::pipeline_schedules;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
# For --all
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
sub pipeline_schedules {
|
||||
my ($self, @repo) = @_;
|
||||
my $ret = 0;
|
||||
|
||||
unless (@repo or $self->config->all) {
|
||||
ds_warn "Usage $0 pipelines <project|--all>";
|
||||
return 1;
|
||||
}
|
||||
if (@repo and $self->config->all) {
|
||||
ds_warn "--all with a project (@repo) makes no sense";
|
||||
return 1;
|
||||
}
|
||||
|
||||
# If --all is asked, launch all projects
|
||||
@repo = map { $_->[1] } $self->get_repo(0, @repo) unless (@repo);
|
||||
|
||||
foreach my $p (sort @repo) {
|
||||
my $id = $self->project2id($p);
|
||||
my $count = 0;
|
||||
unless ($id) {
|
||||
#ds_warn "Project $p not found"; # $self->project2id($p) shows this error
|
||||
$ret++;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
} else {
|
||||
my $projects = $self->api->project($id);
|
||||
if ($projects->{jobs_enabled} == 0) {
|
||||
print "$p has disabled CI/CD\n";
|
||||
next;
|
||||
}
|
||||
|
||||
my $pipelines
|
||||
= $self->api->paginator('pipeline_schedules', $id)->all();
|
||||
|
||||
print "$p\n" if @$pipelines;
|
||||
|
||||
foreach (@$pipelines) {
|
||||
my $status = $_->{active} ? 'Enabled' : 'Disabled';
|
||||
print <<END;
|
||||
\tID : $_->{id}
|
||||
\tDescription: $_->{description}
|
||||
\tStatus : $status
|
||||
\tRef : $_->{ref}
|
||||
\tCron : $_->{cron}
|
||||
\tTimezone : $_->{cron_timezone}
|
||||
\tCreated : $_->{created_at}
|
||||
\tUpdated : $_->{updated_at}
|
||||
\tNext run : $_->{next_run_at}
|
||||
\tOwner : $_->{owner}->{username}
|
||||
|
||||
END
|
||||
}
|
||||
}
|
||||
unless ($count) {
|
||||
next;
|
||||
}
|
||||
}
|
||||
return $ret;
|
||||
}
|
||||
|
||||
1;
|
43
lib/Devscripts/Salsa/protect_branch.pm
Normal file
|
@ -0,0 +1,43 @@
|
|||
# Protects a branch
|
||||
package Devscripts::Salsa::protect_branch;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
use constant levels => {
|
||||
o => 50,
|
||||
owner => 50,
|
||||
m => 40,
|
||||
maintainer => 40,
|
||||
d => 30,
|
||||
developer => 30,
|
||||
r => 20,
|
||||
reporter => 20,
|
||||
g => 10,
|
||||
guest => 10,
|
||||
};
|
||||
|
||||
sub protect_branch {
|
||||
my ($self, $reponame, $branch, $merge, $push) = @_;
|
||||
unless ($reponame and $branch) {
|
||||
ds_warn "usage: $0 protect_branch project branch merge push";
|
||||
return 1;
|
||||
}
|
||||
if (defined $merge and $merge =~ /^(?:no|0)$/i) {
|
||||
$self->api->unprotect_branch($self->project2id($reponame), $branch);
|
||||
return 0;
|
||||
}
|
||||
unless (levels->{$merge} and levels->{$push}) {
|
||||
ds_warn
|
||||
"usage: $0 protect_branch project branch <merge level> <push level>";
|
||||
return 1;
|
||||
}
|
||||
my $opts = { name => $branch };
|
||||
$opts->{push_access_level} = (levels->{$push});
|
||||
$opts->{merge_access_level} = (levels->{$merge});
|
||||
$self->api->protect_branch($self->project2id($reponame), $opts);
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
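# Illustrative invocation (project and branch are hypothetical):
#
#   salsa protect_branch myteam/mypkg debian/master maintainer developer
#
# Role names are mapped to GitLab access levels through the levels constant
# above (maintainer => 40, developer => 30); passing "no" (or 0) as the merge
# level unprotects the branch instead.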
|
27
lib/Devscripts/Salsa/protected_branches.pm
Normal file
|
@ -0,0 +1,27 @@
|
|||
# Displays protected branches of a project
|
||||
package Devscripts::Salsa::protected_branches;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub protected_branches {
|
||||
my ($self, $reponame) = @_;
|
||||
unless ($reponame) {
|
||||
ds_warn "Project name is missing";
|
||||
return 1;
|
||||
}
|
||||
my $branches
|
||||
= $self->api->protected_branches($self->project2id($reponame));
|
||||
if ($branches and @$branches) {
|
||||
printf " %-20s | %-25s | %-25s\n", 'Branch', 'Merge', 'Push';
|
||||
foreach (@$branches) {
|
||||
printf " %-20s | %-25s | %-25s\n", $_->{name},
|
||||
$_->{merge_access_levels}->[0]->{access_level_description},
|
||||
$_->{push_access_levels}->[0]->{access_level_description};
|
||||
}
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
15
lib/Devscripts/Salsa/purge_cache.pm
Normal file
|
@ -0,0 +1,15 @@
|
|||
# Empties the Devscripts::JSONCache
|
||||
package Devscripts::Salsa::purge_cache;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub purge_cache {
|
||||
my @keys = keys %{ $_[0]->_cache };
|
||||
delete $_[0]->_cache->{$_} foreach (@keys);
|
||||
ds_verbose "Cache empty";
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
106
lib/Devscripts/Salsa/push.pm
Normal file
|
@ -0,0 +1,106 @@
|
|||
# Push local work. Like gbp push but able to push incomplete work
|
||||
package Devscripts::Salsa::push;
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Devscripts::Utils;
|
||||
use Dpkg::Source::Format;
|
||||
use Moo::Role;
|
||||
use Dpkg::IPC;
|
||||
|
||||
sub readGbpConf {
|
||||
my ($self) = @_;
|
||||
my $res = '';
|
||||
foreach my $gbpconf (qw(.gbp.conf debian/gbp.conf .git/gbp.conf)) {
|
||||
if (-e $gbpconf) {
|
||||
open(my $f, $gbpconf);
|
||||
while (<$f>) {
|
||||
$res .= $_;
|
||||
if (/^\s*(debian|upstream)\-(branch|tag)\s*=\s*(.*\S)/) {
|
||||
$self->{"$1_$2"} = $3;
|
||||
}
|
||||
}
|
||||
close $f;
|
||||
last;
|
||||
}
|
||||
}
|
||||
if ($self->{debian_tag}) {
|
||||
$self->{debian_tag} =~ s/%\(version\)s/.*/g;
|
||||
$self->{debian_tag} =~ s/^/^/;
|
||||
$self->{debian_tag} =~ s/$/\$/;
|
||||
} else {
|
||||
my @tmp
|
||||
= Dpkg::Source::Format->new(filename => 'debian/source/format')->get;
|
||||
$self->{debian_tag} = $tmp[2] eq 'native' ? '.*' : '^debian/.*$';
|
||||
}
|
||||
if ($self->{upstream_tag}) {
|
||||
$self->{upstream_tag} =~ s/%\(version\)s/.*/g;
|
||||
$self->{upstream_tag} =~ s/^/^/;
|
||||
$self->{upstream_tag} =~ s/$/\$/;
|
||||
} else {
|
||||
$self->{upstream_tag} = '^upstream/.*$';
|
||||
}
|
||||
$self->{debian_branch} ||= 'master';
|
||||
$self->{upstream_branch} ||= 'upstream';
|
||||
return $res;
|
||||
}
|
||||
|
||||
sub push {
|
||||
my ($self) = @_;
|
||||
$self->readGbpConf;
|
||||
my @refs;
|
||||
foreach (
|
||||
$self->{debian_branch}, $self->{upstream_branch},
|
||||
'pristine-tar', 'refs/notes/commits'
|
||||
) {
|
||||
if (ds_exec_no_fail(qw(git rev-parse --verify --quiet), $_) == 0) {
|
||||
push @refs, $_;
|
||||
}
|
||||
}
|
||||
my $out;
|
||||
spawn(exec => ['git', 'tag'], wait_child => 1, to_string => \$out);
|
||||
my @tags = grep /(?:$self->{debian_tag}|$self->{upstream_tag})/,
|
||||
split(/\r?\n/, $out);
|
||||
unless (
|
||||
$ds_yes < 0
|
||||
and ds_prompt(
|
||||
"You're going to push :\n - "
|
||||
. join(', ', @refs)
|
||||
. "\nand check tags that match:\n - "
|
||||
. join(', ', $self->{debian_tag}, $self->{upstream_tag})
|
||||
. "\nContinue (Y/n) "
|
||||
) =~ refuse
|
||||
) {
|
||||
my $origin;
|
||||
eval {
|
||||
spawn(
|
||||
exec => ['git', 'rev-parse', '--abbrev-ref', 'HEAD'],
|
||||
wait_child => 1,
|
||||
to_string => \$out,
|
||||
);
|
||||
chomp $out;
|
||||
spawn(
|
||||
exec =>
|
||||
['git', 'config', '--local', '--get', "branch.$out.remote"],
|
||||
wait_child => 1,
|
||||
to_string => \$origin,
|
||||
);
|
||||
chomp $origin;
|
||||
};
|
||||
if ($origin) {
|
||||
ds_verbose 'Origin is ' . $origin;
|
||||
} else {
|
||||
ds_warn 'Unable to detect remote name, trying "origin"';
|
||||
ds_verbose "Error: $@" if ($@);
|
||||
$origin = 'origin';
|
||||
}
|
||||
ds_verbose "Execute 'git push $origin " . join(' ', @refs, '<tags>');
|
||||
ds_debug "Tags are: " . join(' ', @tags);
|
||||
spawn(
|
||||
exec => ['git', 'push', $origin, @refs, @tags],
|
||||
wait_child => 1
|
||||
);
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
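# Illustrative note on readGbpConf() (values are hypothetical): a gbp.conf tag
# template is turned into an anchored regexp by replacing %(version)s with
# ".*", e.g.
#
#   debian-tag   = debian/%(version)s      =>   ^debian/.*$
#   upstream-tag = upstream/%(version)s    =>   ^upstream/.*$
#
# and existing tags matching either pattern are pushed along with the branches.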
|
71
lib/Devscripts/Salsa/push_repo.pm
Normal file
|
@ -0,0 +1,71 @@
|
|||
# Creates GitLab project from local repository path
|
||||
package Devscripts::Salsa::push_repo;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::create_repo"; # create_project
|
||||
|
||||
sub push_repo {
|
||||
my ($self, $reponame) = @_;
|
||||
unless ($reponame) {
|
||||
ds_warn "Repository path is missing";
|
||||
return 1;
|
||||
}
|
||||
unless (-d $reponame) {
|
||||
ds_warn "$reponame isn't a directory";
|
||||
return 1;
|
||||
}
|
||||
chdir $reponame;
|
||||
eval {
|
||||
spawn(
|
||||
exec => ['dpkg-parsechangelog', '--show-field', 'Source'],
|
||||
to_string => \$reponame,
|
||||
wait_child => 1,
|
||||
);
|
||||
};
|
||||
if ($@) {
|
||||
ds_warn $@;
|
||||
return 1;
|
||||
}
|
||||
chomp $reponame;
|
||||
my $out;
|
||||
spawn(
|
||||
exec => ['git', 'remote', 'show'],
|
||||
to_string => \$out,
|
||||
wait_child => 1,
|
||||
);
|
||||
if ($out =~ /^origin$/m) {
|
||||
ds_warn "git origin is already configured:\n$out";
|
||||
return 1;
|
||||
}
|
||||
my $path = $self->project2path('') or return 1;
|
||||
my $url = $self->config->git_server_url . "$path$reponame";
|
||||
spawn(
|
||||
exec => ['git', 'remote', 'add', 'origin', $url],
|
||||
wait_child => 1,
|
||||
);
|
||||
my $res = $self->create_repo($reponame);
|
||||
if ($res) {
|
||||
return 1
|
||||
unless (
|
||||
ds_prompt(
|
||||
"Project already exists, do you want to try to push local repository? (y/N) "
|
||||
) =~ accept
|
||||
);
|
||||
}
|
||||
spawn(
|
||||
exec =>
|
||||
['git', 'push', '--all', '--verbose', '--set-upstream', 'origin'],
|
||||
wait_child => 1,
|
||||
);
|
||||
spawn(
|
||||
exec => ['git', 'push', '--tags', '--verbose', 'origin'],
|
||||
wait_child => 1,
|
||||
);
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
47
lib/Devscripts/Salsa/rename_branch.pm
Normal file
|
@ -0,0 +1,47 @@
|
|||
package Devscripts::Salsa::rename_branch;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
our $prompt = 1;
|
||||
|
||||
sub rename_branch {
|
||||
my ($self, @reponames) = @_;
|
||||
my $res = 0;
|
||||
my @repos = $self->get_repo($prompt, @reponames);
|
||||
return @repos unless (ref $repos[0]); # get_repo returns 1 when fails
|
||||
foreach (@repos) {
|
||||
my $id = $_->[0];
|
||||
my $str = $_->[1];
|
||||
if (!$id) {
|
||||
ds_warn "Branch rename has failed for $str (missing ID)\n";
|
||||
return 1;
|
||||
}
|
||||
ds_verbose "Configuring $str";
|
||||
my $project = $self->api->project($id);
|
||||
eval {
|
||||
$self->api->create_branch(
|
||||
$id,
|
||||
{
|
||||
ref => $self->config->source_branch,
|
||||
branch => $self->config->dest_branch,
|
||||
});
|
||||
$self->api->delete_branch($id, $self->config->source_branch);
|
||||
};
|
||||
if ($@) {
|
||||
ds_warn "Branch rename has failed for $str\n";
|
||||
ds_verbose $@;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
next;
|
||||
}
|
||||
}
|
||||
return $res;
|
||||
}
|
||||
|
||||
1;
|
37
lib/Devscripts/Salsa/search_group.pm
Normal file
|
@ -0,0 +1,37 @@
|
|||
# Searches groups using given string
|
||||
package Devscripts::Salsa::search_group; # search_groups
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub search_group {
|
||||
my ($self, $group) = @_;
|
||||
unless ($group) {
|
||||
ds_warn "Searched string is missing";
|
||||
return 1;
|
||||
}
|
||||
my $groups = $self->api->group_without_projects($group);
|
||||
if ($groups) {
|
||||
$groups = [$groups];
|
||||
} else {
|
||||
$groups = $self->api->paginator('groups',
|
||||
{ search => $group, order_by => 'name' })->all;
|
||||
}
|
||||
unless ($groups and @$groups) {
|
||||
ds_warn "No group found";
|
||||
return 1;
|
||||
}
|
||||
foreach (@$groups) {
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Name : $_->{name}
|
||||
Full name: $_->{full_name}
|
||||
Full path: $_->{full_path}
|
||||
|
||||
END
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
57
lib/Devscripts/Salsa/search_project.pm
Normal file
|
@ -0,0 +1,57 @@
|
|||
# Searches projects using given string
|
||||
package Devscripts::Salsa::search_project; # search_projects
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub search_project {
|
||||
my ($self, $project) = @_;
|
||||
unless ($project) {
|
||||
ds_warn "Searched string is missing";
|
||||
return 1;
|
||||
}
|
||||
my $projects = $self->api->project($project);
|
||||
if ($projects) {
|
||||
$projects = [$projects];
|
||||
} else {
|
||||
$projects = $self->api->paginator(
|
||||
'projects',
|
||||
{
|
||||
search => $project,
|
||||
order_by => 'name',
|
||||
archived => $self->config->archived
|
||||
})->all();
|
||||
}
|
||||
unless ($projects and @$projects) {
|
||||
ds_warn "No projects found";
|
||||
return 1;
|
||||
}
|
||||
foreach (@$projects) {
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Name : $_->{name}
|
||||
Full path: $_->{path_with_namespace}
|
||||
END
|
||||
print(
|
||||
$_->{namespace}->{kind} eq 'group'
|
||||
? "Group id : "
|
||||
: "User id : "
|
||||
);
|
||||
print "$_->{namespace}->{id}\n";
|
||||
print(
|
||||
$_->{namespace}->{kind} eq 'group'
|
||||
? "Group : "
|
||||
: "User : "
|
||||
);
|
||||
print "$_->{namespace}->{name}\n";
|
||||
if ($_->{forked_from_project} and $_->{forked_from_project}->{id}) {
|
||||
print
|
||||
"Fork of : $_->{forked_from_project}->{name_with_namespace}\n";
|
||||
}
|
||||
print "\n";
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
36
lib/Devscripts/Salsa/search_user.pm
Normal file
|
@ -0,0 +1,36 @@
|
|||
# Searches users using given string
|
||||
package Devscripts::Salsa::search_user; # search_users
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub search_user {
|
||||
my ($self, $user) = @_;
|
||||
unless ($user) {
|
||||
ds_warn "User name is missing";
|
||||
return 1;
|
||||
}
|
||||
my $users = $self->api->user($user);
|
||||
if ($users) {
|
||||
$users = [$users];
|
||||
} else {
|
||||
$users = $self->api->paginator('users', { search => $user })->all();
|
||||
}
|
||||
unless ($users and @$users) {
|
||||
ds_warn "No user found";
|
||||
return 1;
|
||||
}
|
||||
foreach (@$users) {
|
||||
print <<END;
|
||||
Id : $_->{id}
|
||||
Username : $_->{username}
|
||||
Name : $_->{name}
|
||||
State : $_->{state}
|
||||
|
||||
END
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
137
lib/Devscripts/Salsa/update_repo.pm
Executable file
|
@ -0,0 +1,137 @@
|
|||
# Updates projects
|
||||
package Devscripts::Salsa::update_repo; # update_projects
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use GitLab::API::v4::Constants qw(:all);
|
||||
use Moo::Role;
|
||||
|
||||
with "Devscripts::Salsa::Repo";
|
||||
|
||||
our $prompt = 1;
|
||||
|
||||
sub update_repo {
|
||||
my ($self, @reponames) = @_;
|
||||
if ($ds_yes < 0 and $self->config->command eq 'update_repo') {
|
||||
ds_warn
|
||||
"update_projects can't be launched when --info is set, use update_safe";
|
||||
return 1;
|
||||
}
|
||||
unless (@reponames or $self->config->all or $self->config->all_archived) {
|
||||
ds_warn "Usage $0 update_projects <--all|--all-archived|names>";
|
||||
return 1;
|
||||
}
|
||||
if (@reponames and $self->config->all) {
|
||||
ds_warn "--all with a project name makes no sense";
|
||||
return 1;
|
||||
}
|
||||
if (@reponames and $self->config->all_archived) {
|
||||
ds_warn "--all-archived with a project name makes no sense";
|
||||
return 1;
|
||||
}
|
||||
return $self->_update_repo(@reponames);
|
||||
}
|
||||
|
||||
sub _update_repo {
|
||||
my ($self, @reponames) = @_;
|
||||
my $res = 0;
|
||||
# Common options
|
||||
my $configparams = {};
|
||||
# visibility can be modified only by group owners
|
||||
$configparams->{visibility} = 'public'
|
||||
if $self->access_level >= $GITLAB_ACCESS_LEVEL_OWNER;
|
||||
# get project list using Devscripts::Salsa::Repo
|
||||
my @repos = $self->get_repo($prompt, @reponames);
|
||||
return @repos unless (ref $repos[0]); # get_repo returns 1 when fails
|
||||
foreach my $repo (@repos) {
|
||||
my $id = $repo->[0];
|
||||
my $str = $repo->[1];
|
||||
ds_verbose "Configuring $str";
|
||||
eval {
|
||||
# apply new parameters
|
||||
$self->api->edit_project($id,
|
||||
{ %$configparams, $self->desc($str) });
|
||||
# Set project avatar
|
||||
my @avatar_file = $self->desc_multipart($str);
|
||||
$self->api->edit_project_multipart($id, {@avatar_file})
|
||||
if (@avatar_file and $self->config->avatar_path);
|
||||
# add hooks if needed
|
||||
$str =~ s#^.*/##;
|
||||
$self->add_hooks($id, $str);
|
||||
};
|
||||
if ($@) {
|
||||
ds_warn "update_projects has failed for $str\n";
|
||||
ds_verbose $@;
|
||||
$res++;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
next;
|
||||
} elsif ($self->config->rename_head) {
|
||||
# 1 - creates new branch if --rename-head
|
||||
my $project = $self->api->project($id);
|
||||
if ($project->{default_branch} ne $self->config->dest_branch) {
|
||||
eval {
|
||||
$self->api->create_branch(
|
||||
$id,
|
||||
{
|
||||
ref => $self->config->source_branch,
|
||||
branch => $self->config->dest_branch,
|
||||
});
|
||||
};
|
||||
if ($@) {
|
||||
ds_debug $@ if ($@);
|
||||
$project = undef;
|
||||
}
|
||||
|
||||
eval {
|
||||
$self->api->edit_project($id,
|
||||
{ default_branch => $self->config->dest_branch });
|
||||
# delete old branch only if "create_branch" succeed
|
||||
if ($project) {
|
||||
$self->api->delete_branch($id,
|
||||
$self->config->source_branch);
|
||||
}
|
||||
};
|
||||
if ($@) {
|
||||
ds_warn "Branch rename has failed for $str\n";
|
||||
ds_verbose $@;
|
||||
$res++;
|
||||
unless ($self->config->no_fail) {
|
||||
ds_verbose "Use --no-fail to continue";
|
||||
return 1;
|
||||
}
|
||||
next;
|
||||
}
|
||||
} else {
|
||||
ds_verbose "Head already renamed for $str";
|
||||
}
|
||||
}
|
||||
ds_verbose "Project $str updated";
|
||||
}
|
||||
return $res;
|
||||
}
|
||||
|
||||
sub access_level {
|
||||
my ($self) = @_;
|
||||
my $user_id = $self->api->current_user()->{id};
|
||||
if ($self->group_id) {
|
||||
my $tmp = $self->api->all_group_members($self->group_id,
|
||||
{ user_ids => $user_id });
|
||||
unless ($tmp) {
|
||||
my $members
|
||||
= $self->api->paginator('all_group_members', $self->group_id,
|
||||
{ query => $user_id });
|
||||
while ($_ = $members->next) {
|
||||
return $_->{access_level} if ($_->{id} eq $user_id);
|
||||
}
|
||||
ds_warn "You're not member of this group";
|
||||
return 0;
|
||||
}
|
||||
return $tmp->[0]->{access_level};
|
||||
}
|
||||
return $GITLAB_ACCESS_LEVEL_OWNER;
|
||||
}
|
||||
|
||||
1;
|
22
lib/Devscripts/Salsa/update_safe.pm
Normal file
|
@ -0,0 +1,22 @@
|
|||
# Launches check_projects, then update_projects if the user agrees with the changes
|
||||
package Devscripts::Salsa::update_safe;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
with 'Devscripts::Salsa::check_repo'; # check_projects
|
||||
with 'Devscripts::Salsa::update_repo'; # update_projects
|
||||
|
||||
sub update_safe {
|
||||
my $self = shift;
|
||||
my ($res, $fails) = $self->_check_repo(@_);
|
||||
return 0 unless ($res);
|
||||
return $res
|
||||
if (ds_prompt("$res projects misconfigured, update them ? (Y/n) ")
|
||||
=~ refuse);
|
||||
$Devscripts::Salsa::update_repo::prompt = 0;
|
||||
return $self->_update_repo(@$fails);
|
||||
}
|
||||
|
||||
1;
|
38
lib/Devscripts/Salsa/update_user.pm
Normal file
|
@ -0,0 +1,38 @@
|
|||
# Updates user role in a group
|
||||
package Devscripts::Salsa::update_user;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub update_user {
|
||||
my ($self, $level, $user) = @_;
|
||||
unless ($level and $user) {
|
||||
ds_warn "Usage $0 update_user <level> <userid>";
|
||||
return 1;
|
||||
}
|
||||
unless ($self->group_id) {
|
||||
ds_warn "Unable to update user without --group-id";
|
||||
return 1;
|
||||
}
|
||||
|
||||
my $id = $self->username2id($user) or return 1;
|
||||
my $al = $self->levels_name($level) or return 1;
|
||||
return 1
|
||||
if (
|
||||
$ds_yes < 0
|
||||
and ds_prompt(
|
||||
"You're going to accept $user as $level in group $self->{group_id}. Continue (Y/n) "
|
||||
) =~ refuse
|
||||
);
|
||||
$self->api->update_group_member(
|
||||
$self->group_id,
|
||||
$id,
|
||||
{
|
||||
access_level => $al,
|
||||
});
|
||||
ds_warn "User $user removed from group " . $self->group_id;
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
24
lib/Devscripts/Salsa/whoami.pm
Normal file
|
@ -0,0 +1,24 @@
|
|||
# Gives information on token owner
|
||||
package Devscripts::Salsa::whoami;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Moo::Role;
|
||||
|
||||
sub whoami {
|
||||
my ($self) = @_;
|
||||
my $current_user = $self->api->current_user;
|
||||
print <<END;
|
||||
Id : $current_user->{id}
|
||||
Username: $current_user->{username}
|
||||
Name : $current_user->{name}
|
||||
Email : $current_user->{email}
|
||||
State : $current_user->{state}
|
||||
END
|
||||
$self->cache->{user}->{ $current_user->{id} } = $current_user->{username};
|
||||
$self->cache->{user_id}->{ $current_user->{username} }
|
||||
= $current_user->{id};
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
126
lib/Devscripts/Set.pm
Normal file
|
@ -0,0 +1,126 @@
|
|||
# Copyright Bill Allombert <ballombe@debian.org> 2001.
|
||||
# Modifications copyright 2002 Julian Gilbey <jdg@debian.org>
|
||||
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
package Devscripts::Set;
|
||||
|
||||
use strict;
|
||||
|
||||
BEGIN {
|
||||
use Exporter ();
|
||||
use vars qw(@EXPORT @ISA %EXPORT_TAGS);
|
||||
@EXPORT = qw(SetMinus SetInter SetUnion);
|
||||
@ISA = qw(Exporter);
|
||||
%EXPORT_TAGS = ();
|
||||
}
|
||||
|
||||
# Several routines to work with arrays whose elements are unique
|
||||
# (here called sets)
|
||||
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Set - Functions for handling sets.
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use Devscripts::Set;
|
||||
|
||||
@set=ListToSet(@list);
|
||||
|
||||
@setdiff=SetMinus(\@set1,\@set2);
|
||||
|
||||
@setinter=SetInter(\@set1,\@set2);
|
||||
|
||||
@setunion=SetUnion(\@set1,\@set2);
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
ListToSet: Make a set (array with duplicates removed) from a list of
|
||||
items given by an array.
|
||||
|
||||
SetMinus, SetInter, SetUnion: Compute the set theoretic difference,
|
||||
intersection, union of two sets given as arrays.
|
||||
|
||||
=cut
|
||||
|
||||
# Transforms a list to a set, removing duplicates
|
||||
# input: list
|
||||
# output: set
|
||||
|
||||
sub ListToSet (@) {
|
||||
my %items;
|
||||
|
||||
grep $items{$_}++, @_;
|
||||
|
||||
return keys %items;
|
||||
}
|
||||
|
||||
# Compute the set-theoretic difference of two sets.
|
||||
# input: ref to Set 1, ref to Set 2
|
||||
# output: set
|
||||
|
||||
sub SetMinus ($$) {
|
||||
my ($set1, $set2) = @_;
|
||||
my %items;
|
||||
|
||||
grep $items{$_}++, @$set1;
|
||||
grep $items{$_}--, @$set2;
|
||||
|
||||
return grep $items{$_} > 0, keys %items;
|
||||
}
|
||||
|
||||
# Compute the set-theoretic intersection of two sets.
|
||||
# input: ref to Set 1, ref to Set 2
|
||||
# output: set
|
||||
|
||||
sub SetInter ($$) {
|
||||
my ($set1, $set2) = @_;
|
||||
my %items;
|
||||
|
||||
grep $items{$_}++, @$set1;
|
||||
grep $items{$_}++, @$set2;
|
||||
|
||||
return grep $items{$_} == 2, keys %items;
|
||||
}
|
||||
|
||||
#Compute the set-theoretic union of two sets.
|
||||
#input: ref to Set 1, ref to Set 2
|
||||
#output: set
|
||||
|
||||
sub SetUnion ($$) {
|
||||
my ($set1, $set2) = @_;
|
||||
my %items;
|
||||
|
||||
grep $items{$_}++, @$set1;
|
||||
grep $items{$_}++, @$set2;
|
||||
|
||||
return grep $items{$_} > 0, keys %items;
|
||||
}
|
||||
|
||||
1;
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
Bill Allombert <ballombe@debian.org>
|
||||
|
||||
=head1 COPYING
|
||||
|
||||
Copyright 2001 Bill Allombert <ballombe@debian.org>
|
||||
Modifications Copyright 2002 Julian Gilbey <jdg@debian.org>
|
||||
dpkg-depcheck is free software, covered by the GNU General Public License, and
|
||||
you are welcome to change it and/or distribute copies of it under
|
||||
certain conditions. There is absolutely no warranty for dpkg-depcheck.
|
||||
|
||||
=cut
|
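A minimal usage sketch for the set helpers above (illustrative only, not part of the committed code; note that ListToSet is documented but not in @EXPORT, so it is called fully qualified here):

use Devscripts::Set;

my @a = Devscripts::Set::ListToSet(qw(perl dpkg perl git));  # perl dpkg git (order not guaranteed)
my @b = qw(git make);

my @only_a = SetMinus(\@a, \@b);    # perl, dpkg
my @common = SetInter(\@a, \@b);    # git
my @all    = SetUnion(\@a, \@b);    # perl, dpkg, git, make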
27
lib/Devscripts/Uscan/CatchRedirections.pm
Normal file
|
@ -0,0 +1,27 @@
|
|||
# dummy subclass used to store all the redirections for later use
|
||||
package Devscripts::Uscan::CatchRedirections;
|
||||
|
||||
use parent qw(LWP::UserAgent);
|
||||
|
||||
my @uscan_redirections;
|
||||
|
||||
sub redirect_ok {
|
||||
my $self = shift;
|
||||
my ($request) = @_;
|
||||
if ($self->SUPER::redirect_ok(@_)) {
|
||||
push @uscan_redirections, $request->uri;
|
||||
return 1;
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
sub get_redirections {
|
||||
return \@uscan_redirections;
|
||||
}
|
||||
|
||||
sub clear_redirections {
|
||||
undef @uscan_redirections;
|
||||
return;
|
||||
}
|
||||
|
||||
1;
|
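An illustrative sketch (not part of the committed code) of how calling code might use this subclass to inspect the redirect chain of a request; the URL is made up:

use Devscripts::Uscan::CatchRedirections;

my $ua = Devscripts::Uscan::CatchRedirections->new(env_proxy => 1);
$ua->get('https://example.org/download/latest');
for my $uri (@{ $ua->get_redirections }) {
    print "redirected via $uri\n";
}
$ua->clear_redirections;    # reset the shared list before the next request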
394
lib/Devscripts/Uscan/Config.pm
Normal file
|
@ -0,0 +1,394 @@
|
|||
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Uscan::Config - uscan configuration object
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use Devscripts::Uscan::Config;
|
||||
my $config = Devscripts::Uscan::Config->new->parse;
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
Uscan configuration object. It can scan configuration files
|
||||
(B</etc/devscripts.conf> and B<~/.devscripts>) and command line arguments.
|
||||
|
||||
=cut
|
||||
|
||||
package Devscripts::Uscan::Config;
|
||||
|
||||
use strict;
|
||||
|
||||
use Devscripts::Uscan::Output;
|
||||
use Exporter 'import';
|
||||
use Moo;
|
||||
|
||||
extends 'Devscripts::Config';
|
||||
|
||||
our $CURRENT_WATCHFILE_VERSION = 4;
|
||||
|
||||
use constant default_user_agent => "Debian uscan"
|
||||
. ($main::uscan_version ? " $main::uscan_version" : '');
|
||||
|
||||
our @EXPORT = (qw($CURRENT_WATCHFILE_VERSION));
|
||||
|
||||
# I - ACCESSORS
|
||||
|
||||
# Options + default values
|
||||
|
||||
has bare => (is => 'rw');
|
||||
has check_dirname_level => (is => 'rw');
|
||||
has check_dirname_regex => (is => 'rw');
|
||||
has compression => (is => 'rw');
|
||||
has copyright_file => (is => 'rw');
|
||||
has destdir => (is => 'rw');
|
||||
has download => (is => 'rw');
|
||||
has download_current_version => (is => 'rw');
|
||||
has download_debversion => (is => 'rw');
|
||||
has download_version => (is => 'rw');
|
||||
has exclusion => (is => 'rw');
|
||||
has log => (is => 'rw');
|
||||
has orig => (is => 'rw');
|
||||
has package => (is => 'rw');
|
||||
has pasv => (is => 'rw');
|
||||
has http_header => (is => 'rw', default => sub { {} });
|
||||
|
||||
# repack to .tar.$zsuffix if 1
|
||||
has repack => (is => 'rw');
|
||||
has safe => (is => 'rw');
|
||||
has signature => (is => 'rw');
|
||||
has symlink => (is => 'rw');
|
||||
has timeout => (is => 'rw');
|
||||
has user_agent => (is => 'rw');
|
||||
has uversion => (is => 'rw');
|
||||
has vcs_export_uncompressed => (is => 'rw');
|
||||
has watchfile => (is => 'rw');
|
||||
|
||||
# II - Options
|
||||
|
||||
use constant keys => [
|
||||
# 2.1 - Simple parameters that can be set in ~/.devscripts and command line
|
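# (Inferred from the entries below and from how Devscripts::Config is used
#  elsewhere; not authoritative: each entry appears to be
#      [ Getopt::Long spec or undef,
#        devscripts config variable or undef,
#        check: a qr// regex, the string 'bool', or a coderef,
#        optional default value ]
#  e.g. 'timeout=i' is checked against qr/^\d+$/ and defaults to 20.)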
||||
[
|
||||
'check-dirname-level=s', 'DEVSCRIPTS_CHECK_DIRNAME_LEVEL',
|
||||
qr/^[012]$/, 1
|
||||
],
|
||||
[
|
||||
'check-dirname-regex=s', 'DEVSCRIPTS_CHECK_DIRNAME_REGEX',
|
||||
undef, 'PACKAGE(-.+)?'
|
||||
],
|
||||
['dehs!', 'USCAN_DEHS_OUTPUT', sub { $dehs = $_[1]; 1 }],
|
||||
[
|
||||
'destdir=s',
|
||||
'USCAN_DESTDIR',
|
||||
sub {
|
||||
if (-d $_[1]) {
|
||||
$_[0]->destdir($_[1]) if (-d $_[1]);
|
||||
return 1;
|
||||
}
|
||||
return (0,
|
||||
"The directory to store downloaded files(\$destdir): $_[1]");
|
||||
},
|
||||
'..'
|
||||
],
|
||||
['exclusion!', 'USCAN_EXCLUSION', 'bool', 1],
|
||||
['timeout=i', 'USCAN_TIMEOUT', qr/^\d+$/, 20],
|
||||
[
|
||||
'user-agent|useragent=s',
|
||||
'USCAN_USER_AGENT',
|
||||
qr/\w/,
|
||||
sub {
|
||||
default_user_agent;
|
||||
}
|
||||
],
|
||||
['repack', 'USCAN_REPACK', 'bool'],
|
||||
# 2.2 - Simple command line args
|
||||
['bare', undef, 'bool', 0],
|
||||
['compression=s'],
|
||||
['copyright-file=s'],
|
||||
['download-current-version', undef, 'bool'],
|
||||
['download-version=s'],
|
||||
['download-debversion|dversion=s'],
|
||||
['log', undef, 'bool'],
|
||||
['package=s'],
|
||||
['uversion|upstream-version=s'],
|
||||
['vcs-export-uncompressed', 'USCAN_VCS_EXPORT_UNCOMPRESSED', 'bool'],
|
||||
['watchfile=s'],
|
||||
# 2.3 - More complex options
|
||||
# http headers (#955268)
|
||||
['http-header=s', 'USCAN_HTTP_HEADER', undef, sub { {} }],
|
||||
|
||||
# "download" and its aliases
|
||||
[
|
||||
undef,
|
||||
'USCAN_DOWNLOAD',
|
||||
sub {
|
||||
return (1, 'Bad USCAN_DOWNLOAD value, skipping')
|
||||
unless ($_[1] =~ /^(?:yes|(no))$/i);
|
||||
$_[0]->download(0) if $1;
|
||||
return 1;
|
||||
}
|
||||
],
|
||||
[
|
||||
'download|d+',
|
||||
undef,
|
||||
sub {
|
||||
$_[1] =~ s/^yes$/1/i;
|
||||
$_[1] =~ s/^no$/0/i;
|
||||
return (0, "Wrong number of -d")
|
||||
unless ($_[1] =~ /^[0123]$/);
|
||||
$_[0]->download($_[1]);
|
||||
return 1;
|
||||
},
|
||||
1
|
||||
],
|
||||
[
|
||||
'force-download',
|
||||
undef,
|
||||
sub {
|
||||
$_[0]->download(2);
|
||||
}
|
||||
],
|
||||
['no-download', undef, sub { $_[0]->download(0); return 1; }],
|
||||
['overwrite-download', undef, sub { $_[0]->download(3) }],
|
||||
|
||||
# "pasv"
|
||||
[
|
||||
'pasv|passive',
|
||||
'USCAN_PASV',
|
||||
sub {
|
||||
return $_[0]->pasv('default')
|
||||
unless ($_[1] =~ /^(yes|0|1|no)$/);
|
||||
$_[0]->pasv({
|
||||
yes => 1,
|
||||
1 => 1,
|
||||
no => 0,
|
||||
0 => 0,
|
||||
}->{$1});
|
||||
return 1;
|
||||
},
|
||||
0
|
||||
],
|
||||
|
||||
# "safe" and "symlink" and their aliases
|
||||
['safe|report', 'USCAN_SAFE', 'bool', 0],
|
||||
[
|
||||
'report-status',
|
||||
undef,
|
||||
sub {
|
||||
$_[0]->safe(1);
|
||||
$verbose ||= 1;
|
||||
}
|
||||
],
|
||||
['copy', undef, sub { $_[0]->symlink('copy') }],
|
||||
['rename', undef, sub { $_[0]->symlink('rename') if ($_[1]); 1; }],
|
||||
[
|
||||
'symlink!',
|
||||
'USCAN_SYMLINK',
|
||||
sub {
|
||||
$_[0]->symlink(
|
||||
$_[1] =~ /^(no|0|rename)$/ ? $1
|
||||
: $_[1] =~ /^(yes|1|symlink)$/ ? 'symlink'
|
||||
: 'no'
|
||||
);
|
||||
return 1;
|
||||
},
|
||||
'symlink'
|
||||
],
|
||||
# "signature" and its aliases
|
||||
['signature!', undef, 'bool', 1],
|
||||
['skipsignature|skip-signature', undef, sub { $_[0]->signature(-1) }],
|
||||
# "verbose" and its aliases
|
||||
['debug', undef, sub { $verbose = 2 }],
|
||||
['extra-debug', undef, sub { $verbose = 3 }],
|
||||
['no-verbose', undef, sub { $verbose = 0; return 1; }],
|
||||
[
|
||||
'verbose|v+',
|
||||
'USCAN_VERBOSE',
|
||||
sub {
|
||||
$verbose = ($_[1] =~ /^yes$/i ? 1 : $_[1] =~ /^(\d)$/ ? $1 : 0);
|
||||
return 1;
|
||||
}
|
||||
],
|
||||
# Display version
|
||||
[
|
||||
'version',
|
||||
undef,
|
||||
sub {
|
||||
if ($_[1]) { $_[0]->version; exit 0 }
|
||||
}
|
||||
]];
|
||||
|
||||
use constant rules => [
|
||||
sub {
|
||||
my $self = shift;
|
||||
if ($self->package) {
|
||||
$self->download(0)
|
||||
unless ($self->download > 1); # compatibility
|
||||
return (0,
|
||||
"The --package option requires to set the --watchfile option, too."
|
||||
) unless defined $self->watchfile;
|
||||
}
|
||||
$self->download(0) if ($self->safe == 1 and $self->download == 1);
|
||||
return 1;
|
||||
},
|
||||
# $signature: -1 = no downloading signature and no verifying signature,
|
||||
# 0 = no downloading signature but verifying signature,
|
||||
# 1 = downloading signature and verifying signature
|
||||
sub {
|
||||
my $self = shift;
|
||||
$self->signature(-1)
|
||||
if $self->download == 0; # Change default 1 -> -1
|
||||
return 1;
|
||||
},
|
||||
sub {
|
||||
if (defined $_[0]->watchfile and @ARGV) {
|
||||
return (0, "Can't have directory arguments if using --watchfile");
|
||||
}
|
||||
return 1;
|
||||
},
|
||||
];
|
||||
|
||||
# help methods
|
||||
sub usage {
|
||||
my ($self) = @_;
|
||||
print <<"EOF";
|
||||
Usage: $progname [options] [dir ...]
|
||||
Process watch files in all .../debian/ subdirs of those listed (or the
|
||||
current directory if none listed) to check for upstream releases.
|
||||
Options:
|
||||
--no-conf, --noconf
|
||||
Don\'t read devscripts config files;
|
||||
must be the first option given
|
||||
--no-verbose Don\'t report verbose information.
|
||||
--verbose, -v Report verbose information.
|
||||
--debug, -vv Report verbose information including the downloaded
|
||||
web pages as processed to STDERR for debugging.
|
||||
--extra-debug, -vvv Report also remote content during "search" step
|
||||
--dehs Send DEHS style output (XML-type) to STDOUT, while
|
||||
sending all other uscan output to STDERR.
|
||||
--no-dehs Use only traditional uscan output format (default)
|
||||
--download, -d
|
||||
Download the new upstream release (default)
|
||||
--force-download, -dd
|
||||
Download the new upstream release, even if up-to-date
|
||||
(may not overwrite the local file)
|
||||
--overwrite-download, -ddd
|
||||
Download the new upstream release, even if up-to-date
|
||||
(may overwrite the local file)
|
||||
--no-download, --nodownload
|
||||
Don\'t download and report information.
|
||||
Previously downloaded tarballs may be used.
|
||||
Change default to --skip-signature.
|
||||
--signature Download signature and verify (default)
|
||||
--no-signature Don\'t download signature but verify if already downloaded.
|
||||
--skip-signature
|
||||
Don\'t bother downloading the signature or verifying it.
|
||||
--safe, --report
|
||||
avoid running unsafe scripts by skipping both the repacking
|
||||
of the downloaded package and the updating of the new
|
||||
source tree. Change default to --no-download and
|
||||
--skip-signature.
|
||||
--report-status (= --safe --verbose)
|
||||
--download-version VERSION
|
||||
Specify the version which the upstream release must
|
||||
match in order to be considered, rather than using the
|
||||
release with the highest version
|
||||
--download-debversion VERSION
|
||||
Specify the Debian package version to download the
|
||||
corresponding upstream release version. The
|
||||
dversionmangle and uversionmangle rules are
|
||||
considered.
|
||||
--download-current-version
|
||||
Download the currently packaged version
|
||||
--check-dirname-level N
|
||||
Check parent directory name?
|
||||
N=0 never check parent directory name
|
||||
N=1 only when $progname changes directory (default)
|
||||
N=2 always check parent directory name
|
||||
--check-dirname-regex REGEX
|
||||
What constitutes a matching directory name; REGEX is
|
||||
a Perl regular expression; the string \`PACKAGE\' will
|
||||
be replaced by the package name; see manpage for details
|
||||
(default: 'PACKAGE(-.+)?')
|
||||
--destdir Path of directory to which to download.
|
||||
--package PACKAGE
|
||||
Specify the package name rather than examining
|
||||
debian/changelog; must use --upstream-version and
|
||||
--watchfile with this option, no directory traversing
|
||||
will be performed, no actions (even downloading) will be
|
||||
carried out
|
||||
--upstream-version VERSION
|
||||
Specify the current upstream version in use rather than
|
||||
parsing debian/changelog to determine this
|
||||
--watchfile FILE
|
||||
Specify the watch file rather than using debian/watch;
|
||||
no directory traversing will be done in this case
|
||||
--bare Disable all site-specific special-case code that performs URL
|
||||
redirections and page content alterations.
|
||||
--no-exclusion Disable automatic exclusion of files mentioned in
|
||||
debian/copyright field Files-Excluded and Files-Excluded-*
|
||||
--pasv Use PASV mode for FTP connections
|
||||
--no-pasv Don\'t use PASV mode for FTP connections (default)
|
||||
--no-symlink Don\'t rename nor repack upstream tarball
|
||||
--timeout N Specifies how much time, in seconds, we give remote
|
||||
servers to respond (default 20 seconds)
|
||||
--user-agent, --useragent
|
||||
Override the default user agent string
|
||||
--log Record md5sum changes of repackaging
|
||||
--help Show this message
|
||||
--version Show version information
|
||||
|
||||
Options passed on to mk-origtargz:
|
||||
--symlink Create a correctly named symlink to downloaded file (default)
|
||||
--rename Rename instead of symlinking
|
||||
--copy Copy instead of symlinking
|
||||
--repack Repack downloaded archives to change compression
|
||||
--compression [ gzip | bzip2 | lzma | xz ]
|
||||
When the upstream sources are repacked, use compression COMP
|
||||
for the resulting tarball (default: gzip)
|
||||
--copyright-file FILE
|
||||
Remove files matching the patterns found in FILE
|
||||
|
||||
Default settings modified by devscripts configuration files:
|
||||
$self->{modified_conf_msg}
|
||||
EOF
|
||||
}
|
||||
|
||||
sub version {
|
||||
print <<"EOF";
|
||||
This is $progname, from the Debian devscripts package, version $main::uscan_version
|
||||
This code is copyright 1999-2006 by Julian Gilbey and 2018 by Xavier Guimard,
|
||||
all rights reserved.
|
||||
Original code by Christoph Lameter.
|
||||
This program comes with ABSOLUTELY NO WARRANTY.
|
||||
You are free to redistribute this code under the terms of the
|
||||
GNU General Public License, version 2 or later.
|
||||
EOF
|
||||
}
|
||||
|
||||
1;
|
||||
__END__
|
||||
=head1 SEE ALSO
|
||||
|
||||
L<uscan>, L<Devscripts::Config>
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
B<uscan> was originally written by Christoph Lameter
|
||||
E<lt>clameter@debian.orgE<gt> (I believe), modified by Julian Gilbey
|
||||
E<lt>jdg@debian.orgE<gt>. HTTP support was added by Piotr Roszatycki
|
||||
E<lt>dexter@debian.orgE<gt>. B<uscan> was rewritten in Perl by Julian Gilbey.
|
||||
Xavier Guimard E<lt>yadd@debian.orgE<gt> rewrote uscan in object
|
||||
oriented Perl.
|
||||
|
||||
=head1 COPYRIGHT AND LICENSE
|
||||
|
||||
Copyright 2002-2006 by Julian Gilbey <jdg@debian.org>,
|
||||
2018 by Xavier Guimard <yadd@debian.org>
|
||||
|
||||
This program is free software; you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation; either version 2 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
=cut
|
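An illustrative sketch (not part of the committed code) of the typical call sequence; it assumes the caller sets $main::uscan_version first, as uscan itself does:

use Devscripts::Uscan::Config;

$main::uscan_version = '2.25.15';
my $config = Devscripts::Uscan::Config->new->parse;    # config files, then @ARGV

print "download: ", $config->download, "\n";    # 0..3, default 1
print "timeout : ", $config->timeout,  "\n";    # default 20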
36
lib/Devscripts/Uscan/Ctype/nodejs.pm
Normal file
|
@ -0,0 +1,36 @@
|
|||
package Devscripts::Uscan::Ctype::nodejs;
|
||||
|
||||
use strict;
|
||||
|
||||
use Moo;
|
||||
use JSON;
|
||||
use Devscripts::Uscan::Output;
|
||||
|
||||
has dir => (is => 'ro');
|
||||
has pkg => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
$_[0]->{dir} . '/package.json';
|
||||
});
|
||||
|
||||
sub version {
|
||||
my ($self) = @_;
|
||||
return unless $self->dir and -d $self->dir;
|
||||
unless (-r $self->pkg) {
|
||||
uscan_warn "Unable to read $self->{pkg}, skipping current version";
|
||||
return;
|
||||
}
|
||||
my ($version, $content);
|
||||
{
|
||||
local $/ = undef;
|
||||
open my $f, $self->pkg;
|
||||
$content = <$f>;
|
||||
close $f;
|
||||
}
|
||||
eval { $version = decode_json($content)->{version}; };
|
||||
uscan_warn $@ if $@;
|
||||
return $version;
|
||||
}
|
||||
|
||||
1;
|
36
lib/Devscripts/Uscan/Ctype/perl.pm
Normal file
|
@ -0,0 +1,36 @@
|
|||
package Devscripts::Uscan::Ctype::perl;
|
||||
|
||||
use strict;
|
||||
|
||||
use Moo;
|
||||
use JSON;
|
||||
use Devscripts::Uscan::Output;
|
||||
|
||||
has dir => (is => 'ro');
|
||||
has pkg => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
$_[0]->{dir} . '/META.json';
|
||||
});
|
||||
|
||||
sub version {
|
||||
my ($self) = @_;
|
||||
return unless $self->dir and -d $self->dir;
|
||||
unless (-r $self->pkg) {
|
||||
uscan_warn "Unable to read $self->{pkg}, skipping current version";
|
||||
return;
|
||||
}
|
||||
my ($version, $content);
|
||||
{
|
||||
local $/ = undef;
|
||||
open my $f, $self->pkg;
|
||||
$content = <$f>;
|
||||
close $f;
|
||||
}
|
||||
eval { $version = decode_json($content)->{version}; };
|
||||
uscan_warn $@ if $@;
|
||||
return $version;
|
||||
}
|
||||
|
||||
1;
|
346
lib/Devscripts/Uscan/Downloader.pm
Normal file
|
@ -0,0 +1,346 @@
|
|||
package Devscripts::Uscan::Downloader;
|
||||
|
||||
use strict;
|
||||
use Cwd qw/cwd abs_path/;
|
||||
use Devscripts::Uscan::CatchRedirections;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Dpkg::IPC;
|
||||
use File::DirList;
|
||||
use File::Find;
|
||||
use File::Temp qw/tempdir/;
|
||||
use File::Touch;
|
||||
use Moo;
|
||||
use URI;
|
||||
|
||||
our $haveSSL;
|
||||
|
||||
has git_upstream => (is => 'rw');
|
||||
|
||||
BEGIN {
|
||||
eval { require LWP::UserAgent; };
|
||||
if ($@) {
|
||||
my $progname = basename($0);
|
||||
if ($@ =~ /^Can\'t locate LWP\/UserAgent\.pm/) {
|
||||
die "$progname: you must have the libwww-perl package installed\n"
|
||||
. "to use this script";
|
||||
} else {
|
||||
die "$progname: problem loading the LWP::UserAgent module:\n $@\n"
|
||||
. "Have you installed the libwww-perl package?";
|
||||
}
|
||||
}
|
||||
eval { require LWP::Protocol::https; };
|
||||
$haveSSL = $@ ? 0 : 1;
|
||||
}
|
||||
|
||||
has agent =>
|
||||
(is => 'rw', default => sub { "Debian uscan $main::uscan_version" });
|
||||
has timeout => (is => 'rw');
|
||||
has pasv => (
|
||||
is => 'rw',
|
||||
default => 'default',
|
||||
trigger => sub {
|
||||
my ($self, $nv) = @_;
|
||||
if ($nv) {
|
||||
uscan_verbose "Set passive mode: $self->{pasv}";
|
||||
$ENV{'FTP_PASSIVE'} = $self->pasv;
|
||||
} elsif ($ENV{'FTP_PASSIVE'}) {
|
||||
uscan_verbose "Unset passive mode";
|
||||
delete $ENV{'FTP_PASSIVE'};
|
||||
}
|
||||
});
|
||||
has destdir => (is => 'rw');
|
||||
|
||||
# 0: no repo, 1: shallow clone, 2: full clone
|
||||
has gitrepo_state => (
|
||||
is => 'rw',
|
||||
default => sub { 0 });
|
||||
has git_export_all => (
|
||||
is => 'rw',
|
||||
default => sub { 0 });
|
||||
has user_agent => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
my ($self) = @_;
|
||||
my $user_agent
|
||||
= Devscripts::Uscan::CatchRedirections->new(env_proxy => 1);
|
||||
$user_agent->timeout($self->timeout);
|
||||
$user_agent->agent($self->agent);
|
||||
|
||||
# Strip Referer header for Sourceforge to avoid SF sending back a
|
||||
# "200 OK" with a <meta refresh=...> redirect
|
||||
$user_agent->add_handler(
|
||||
'request_prepare' => sub {
|
||||
my ($request, $ua, $h) = @_;
|
||||
$request->remove_header('Referer');
|
||||
},
|
||||
m_hostname => 'sourceforge.net',
|
||||
);
|
||||
$self->{user_agent} = $user_agent;
|
||||
});
|
||||
|
||||
has ssl => (is => 'rw', default => sub { $haveSSL });
|
||||
|
||||
has headers => (
|
||||
is => 'ro',
|
||||
default => sub { {} });
|
||||
|
||||
sub download ($$$$$$$$) {
|
||||
my (
|
||||
$self, $url, $fname, $optref, $base,
|
||||
$pkg_dir, $pkg, $mode, $gitrepo_dir
|
||||
) = @_;
|
||||
my ($request, $response);
|
||||
$mode ||= $optref->mode;
|
||||
if ($mode eq 'http') {
|
||||
if ($url =~ /^https/ and !$self->ssl) {
|
||||
uscan_die "$progname: you must have the "
|
||||
. "liblwp-protocol-https-perl package installed\n"
|
||||
. "to use https URLs";
|
||||
}
|
||||
|
||||
# substitute HTML entities
|
||||
# Is anything else than "&" required? I doubt it.
|
||||
uscan_verbose "Requesting URL:\n $url";
|
||||
my $headers = HTTP::Headers->new;
|
||||
$headers->header('Accept' => '*/*');
|
||||
$headers->header('Referer' => $base);
|
||||
my $uri_o = URI->new($url);
|
||||
foreach my $k (keys %{ $self->headers }) {
|
||||
if ($k =~ /^(.*?)@(.*)$/) {
|
||||
my $baseUrl = $1;
|
||||
my $hdr = $2;
|
||||
if ($url =~ m#^\Q$baseUrl\E(?:/.*)?$#) {
|
||||
$headers->header($hdr => $self->headers->{$k});
|
||||
uscan_verbose "Set per-host custom header $hdr for $url";
|
||||
} else {
|
||||
uscan_debug "$url does not start with $1";
|
||||
}
|
||||
} else {
|
||||
uscan_warn "Malformed http-header: $k";
|
||||
}
|
||||
}
|
||||
$request = HTTP::Request->new('GET', $url, $headers);
|
||||
$response = $self->user_agent->request($request, $fname);
|
||||
if (!$response->is_success) {
|
||||
uscan_warn((defined $pkg_dir ? "In directory $pkg_dir, d" : "D")
|
||||
. "ownloading\n $url failed: "
|
||||
. $response->status_line);
|
||||
return 0;
|
||||
}
|
||||
} elsif ($mode eq 'ftp') {
|
||||
uscan_verbose "Requesting URL:\n $url";
|
||||
$request = HTTP::Request->new('GET', "$url");
|
||||
$response = $self->user_agent->request($request, $fname);
|
||||
if (!$response->is_success) {
|
||||
uscan_warn(
|
||||
(defined $pkg_dir ? "In directory $pkg_dir, d" : "D")
|
||||
. "ownloading\n $url failed: "
|
||||
. $response->status_line);
|
||||
return 0;
|
||||
}
|
||||
} else { # elsif ($$optref{'mode'} eq 'git')
|
||||
my $destdir = $self->destdir;
|
||||
my $curdir = cwd();
|
||||
$fname =~ m%(.*)/$pkg-([^_/]*)\.tar(?:\.(gz|xz|bz2|lzma|zstd?))?%;
|
||||
my $dst = $1;
|
||||
my $abs_dst = abs_path($dst);
|
||||
my $ver = $2;
|
||||
my $suffix = $3;
|
||||
my ($gitrepo, $gitref) = split /[[:space:]]+/, $url, 2;
|
||||
my $clean = sub {
|
||||
uscan_exec_no_fail('rm', '-fr', $gitrepo_dir);
|
||||
};
|
||||
my $clean_and_die = sub {
|
||||
$clean->();
|
||||
uscan_die @_;
|
||||
};
|
||||
|
||||
if ($mode eq 'svn') {
|
||||
my $tempdir = tempdir(CLEANUP => 1);
|
||||
my $old_umask = umask(oct('022'));
|
||||
uscan_exec('svn', 'export', $url, "$tempdir/$pkg-$ver");
|
||||
umask($old_umask);
|
||||
find({
|
||||
wanted => sub {
|
||||
return if !-d $File::Find::name;
|
||||
my ($newest) = grep { $_ ne '.' && $_ ne '..' }
|
||||
map { $_->[13] } @{ File::DirList::list($_, 'M') };
|
||||
return if !$newest;
|
||||
my $touch
|
||||
= File::Touch->new(reference => $_ . '/' . $newest);
|
||||
$touch->touch($_);
|
||||
},
|
||||
bydepth => 1,
|
||||
no_chdir => 1,
|
||||
},
|
||||
"$tempdir/$pkg-$ver"
|
||||
);
|
||||
uscan_exec(
|
||||
'tar', '-C',
|
||||
$tempdir, '--sort=name',
|
||||
'--owner=root', '--group=root',
|
||||
'-cvf', "$abs_dst/$pkg-$ver.tar",
|
||||
"$pkg-$ver"
|
||||
);
|
||||
} elsif ($self->git_upstream) {
|
||||
my ($infodir, $attr_file, $attr_bkp);
|
||||
if ($self->git_export_all) {
|
||||
# override any export-subst and export-ignore attributes
|
||||
spawn(
|
||||
exec => [qw|git rev-parse --git-path info/|],
|
||||
to_string => \$infodir,
|
||||
);
|
||||
chomp $infodir;
|
||||
mkdir $infodir unless -e $infodir;
|
||||
spawn(
|
||||
exec => [qw|git rev-parse --git-path info/attributes|],
|
||||
to_string => \$attr_file,
|
||||
);
|
||||
chomp $attr_file;
|
||||
spawn(
|
||||
exec =>
|
||||
[qw|git rev-parse --git-path info/attributes-uscan|],
|
||||
to_string => \$attr_bkp,
|
||||
);
|
||||
chomp $attr_bkp;
|
||||
rename $attr_file, $attr_bkp if -e $attr_file;
|
||||
my $attr_fh;
|
||||
|
||||
unless (open($attr_fh, '>', $attr_file)) {
|
||||
rename $attr_bkp, $attr_file if -e $attr_bkp;
|
||||
uscan_die("could not open $attr_file for writing");
|
||||
}
|
||||
print $attr_fh "* -export-subst\n* -export-ignore\n";
|
||||
close $attr_fh;
|
||||
}
|
||||
|
||||
uscan_exec_no_fail('git', 'archive', '--format=tar',
|
||||
"--prefix=$pkg-$ver/", "--output=$abs_dst/$pkg-$ver.tar",
|
||||
$gitref) == 0
|
||||
or $clean_and_die->("git archive failed");
|
||||
|
||||
if ($self->git_export_all) {
|
||||
# restore attributes
|
||||
if (-e $attr_bkp) {
|
||||
rename $attr_bkp, $attr_file;
|
||||
} else {
|
||||
unlink $attr_file;
|
||||
}
|
||||
}
|
||||
} else {
|
||||
if ($self->gitrepo_state == 0) {
|
||||
my @opts = ();
|
||||
if ($optref->git->{modules}) {
|
||||
foreach my $m (@{ $optref->git->{modules} }) {
|
||||
push(@opts, "--recurse-submodules=$m");
|
||||
}
|
||||
} else {
|
||||
push(@opts, '--bare');
|
||||
}
|
||||
$self->gitrepo_state(2);
|
||||
if ($optref->git->{mode} eq 'shallow') {
|
||||
my $tag = $gitref;
|
||||
$tag =~ s#^refs/(?:tags|heads)/##;
|
||||
|
||||
if ($optref->git->{modules}) {
|
||||
push(@opts, '--shallow-submodules');
|
||||
}
|
||||
push(@opts, '--depth=1', '-b', $tag);
|
||||
$self->gitrepo_state(1);
|
||||
}
|
||||
uscan_exec('git', 'clone', @opts, $base,
|
||||
"$destdir/$gitrepo_dir");
|
||||
}
|
||||
|
||||
chdir "$destdir/$gitrepo_dir"
|
||||
or
|
||||
$clean_and_die->("Unable to chdir($destdir/$gitrepo_dir): $!");
|
||||
|
||||
if ($self->git_export_all) {
|
||||
my (@info_dirs, @attr_files);
|
||||
my @arr_refs = (\@info_dirs, \@attr_files);
|
||||
my @gitpaths = ("info/", "info/attributes");
|
||||
|
||||
for (my $tmp, my $i = 0 ; $i < @gitpaths ; $i++) {
|
||||
my @cmd
|
||||
= ("git", "rev-parse", "--git-path", ${ gitpaths [$i] });
|
||||
spawn(
|
||||
exec => [@cmd],
|
||||
to_string => \$tmp,
|
||||
);
|
||||
chomp $tmp;
|
||||
push(@{ $arr_refs[$i] }, split(/\n/, $tmp));
|
||||
|
||||
if ($optref->git->{modules}) {
|
||||
spawn(
|
||||
exec =>
|
||||
['git', 'submodule', '--quiet', 'foreach', @cmd],
|
||||
to_string => \$tmp,
|
||||
);
|
||||
chomp $tmp;
|
||||
push(@{ $arr_refs[$i] }, split(/\n/, $tmp));
|
||||
}
|
||||
}
|
||||
|
||||
foreach my $infodir (@info_dirs) {
|
||||
mkdir $infodir unless -e $infodir;
|
||||
}
|
||||
|
||||
# override any export-subst and export-ignore attributes
|
||||
foreach my $attr_file (@attr_files) {
|
||||
my $attr_fh;
|
||||
open($attr_fh, '>', $attr_file);
|
||||
print $attr_fh "* -export-subst\n* -export-ignore\n";
|
||||
close $attr_fh;
|
||||
}
|
||||
}
|
||||
|
||||
# archive main repository
|
||||
uscan_exec_no_fail('git', 'archive', '--format=tar',
|
||||
"--prefix=$pkg-$ver/",
|
||||
"--output=$abs_dst/$pkg-$ver.tar", $gitref) == 0
|
||||
or $clean_and_die->("$gitrepo_dir", "git archive failed");
|
||||
|
||||
# archive submodules, append to main tarball, clean up
|
||||
if ($optref->git->{modules}) {
|
||||
my $cmd = join ' ',
|
||||
"git archive --format=tar --prefix=$pkg-$ver/\$sm_path/",
|
||||
"--output=$abs_dst/\$sha1.tar HEAD",
|
||||
"&& tar -Af $abs_dst/$pkg-$ver.tar $abs_dst/\$sha1.tar",
|
||||
"&& rm $abs_dst/\$sha1.tar";
|
||||
uscan_exec_no_fail('git', 'submodule', '--quiet', 'foreach',
|
||||
$cmd) == 0
|
||||
or $clean_and_die->("git archive (submodules) failed");
|
||||
}
|
||||
|
||||
chdir "$curdir"
|
||||
or $clean_and_die->("Unable to chdir($curdir): $!");
|
||||
}
|
||||
|
||||
if (defined($suffix)) {
|
||||
chdir "$abs_dst"
|
||||
or $clean_and_die->("Unable to chdir($abs_dst): $!");
|
||||
if ($suffix eq 'gz') {
|
||||
uscan_exec("gzip", "-n", "-9", "$pkg-$ver.tar");
|
||||
} elsif ($suffix eq 'xz') {
|
||||
uscan_exec("xz", "$pkg-$ver.tar");
|
||||
} elsif ($suffix eq 'bz2') {
|
||||
uscan_exec("bzip2", "$pkg-$ver.tar");
|
||||
} elsif ($suffix eq 'lzma') {
|
||||
uscan_exec("lzma", "$pkg-$ver.tar");
|
||||
#} elsif ($suffix =~ /^zstd?$/) {
|
||||
# uscan_exec("zstd", "$pkg-$ver.tar");
|
||||
} else {
|
||||
$clean_and_die->("Unknown suffix file to repack: $suffix");
|
||||
}
|
||||
chdir "$curdir"
|
||||
or $clean_and_die->("Unable to chdir($curdir): $!");
|
||||
}
|
||||
$clean->();
|
||||
}
|
||||
return 1;
|
||||
}
|
||||
|
||||
1;
|
257
lib/Devscripts/Uscan/FindFiles.pm
Normal file
|
@ -0,0 +1,257 @@
|
|||
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Uscan::FindFiles - watchfile finder
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use Devscripts::Uscan::Config;
|
||||
use Devscripts::Uscan::FindFiles;
|
||||
|
||||
# Get config
|
||||
my $config = Devscripts::Uscan::Config->new->parse;
|
||||
|
||||
# Search watchfiles
|
||||
my @wf = find_watch_files($config);
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
This package exports the B<find_watch_files()> function. This function searches
|
||||
for Debian watch files according to the configuration parameters.
|
||||
|
||||
=head1 SEE ALSO
|
||||
|
||||
L<uscan>, L<Devscripts::Uscan::WatchFile>, L<Devscripts::Uscan::Config>
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
B<uscan> was originally written by Christoph Lameter
|
||||
E<lt>clameter@debian.orgE<gt> (I believe), modified by Julian Gilbey
|
||||
E<lt>jdg@debian.orgE<gt>. HTTP support was added by Piotr Roszatycki
|
||||
E<lt>dexter@debian.orgE<gt>. B<uscan> was rewritten in Perl by Julian Gilbey.
|
||||
Xavier Guimard E<lt>yadd@debian.orgE<gt> rewrote uscan in object
|
||||
oriented Perl.
|
||||
|
||||
=head1 COPYRIGHT AND LICENSE
|
||||
|
||||
Copyright 2002-2006 by Julian Gilbey <jdg@debian.org>,
|
||||
2018 by Xavier Guimard <yadd@debian.org>
|
||||
|
||||
This program is free software; you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation; either version 2 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
=cut
|
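An illustrative sketch (not part of the committed code) of consuming the returned tuples; each element appears to be [directory, package, upstream version, watch file path], plus the working directory in the multi-directory case:

for my $wf (find_watch_files($config)) {
    my ($dir, $package, $uversion, $watchfile) = @$wf;
    print "$package $uversion: $dir/$watchfile\n";
}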
||||
|
||||
package Devscripts::Uscan::FindFiles;
|
||||
|
||||
use strict;
|
||||
use filetest 'access';
|
||||
use Cwd qw/cwd/;
|
||||
use Exporter 'import';
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Versort;
|
||||
use Dpkg::Changelog::Parse qw(changelog_parse);
|
||||
use File::Basename;
|
||||
|
||||
our @EXPORT = ('find_watch_files');
|
||||
|
||||
sub find_watch_files {
|
||||
my ($config) = @_;
|
||||
my $opwd = cwd();
|
||||
|
||||
# when --watchfile is used
|
||||
if (defined $config->watchfile) {
|
||||
uscan_verbose "Option --watchfile=$config->{watchfile} used";
|
||||
my ($config) = (@_);
|
||||
|
||||
# no directory traversing then, and things are very simple
|
||||
if (defined $config->package) {
|
||||
|
||||
# no need to even look for a changelog!
|
||||
return (
|
||||
['.', $config->package, $config->uversion, $config->watchfile]
|
||||
);
|
||||
} else {
|
||||
# Check for debian/changelog file
|
||||
until (-r 'debian/changelog') {
|
||||
chdir '..' or uscan_die "can't chdir ..: $!";
|
||||
if (cwd() eq '/') {
|
||||
uscan_die "Are you in the source code tree?\n"
|
||||
. " Cannot find readable debian/changelog anywhere!";
|
||||
}
|
||||
}
|
||||
|
||||
my ($package, $debversion, $uversion)
|
||||
= scan_changelog($config, $opwd, 1);
|
||||
|
||||
return ([cwd(), $package, $uversion, $config->watchfile]);
|
||||
}
|
||||
}
|
||||
|
||||
# when --watchfile is not used, scan watch files
|
||||
push @ARGV, '.' if !@ARGV;
|
||||
{
|
||||
local $, = ',';
|
||||
uscan_verbose "Scan watch files in @ARGV";
|
||||
}
|
||||
|
||||
# Run find to find the directories. We will handle filenames with spaces
|
||||
# correctly, which makes this code a little messier than it would be
|
||||
# otherwise.
|
||||
my @dirs;
|
||||
open FIND, '-|', 'find', '-L', @ARGV,
|
||||
qw{-type d ( -name .git -prune -o -name debian -print ) }
|
||||
or uscan_die "Couldn't exec find: $!";
|
||||
|
||||
while (<FIND>) {
|
||||
chomp;
|
||||
push @dirs, $_;
|
||||
uscan_debug "Found $_";
|
||||
}
|
||||
close FIND;
|
||||
|
||||
uscan_die "No debian directories found" unless @dirs;
|
||||
|
||||
my @debdirs = ();
|
||||
|
||||
my $origdir = cwd;
|
||||
for my $dir (@dirs) {
|
||||
$dir =~ s%/debian$%%;
|
||||
|
||||
unless (chdir $origdir) {
|
||||
uscan_warn "Couldn't chdir back to $origdir, skipping: $!";
|
||||
next;
|
||||
}
|
||||
unless (chdir $dir) {
|
||||
uscan_warn "Couldn't chdir $dir, skipping: $!";
|
||||
next;
|
||||
}
|
||||
|
||||
uscan_verbose "Check debian/watch and debian/changelog in $dir";
|
||||
|
||||
# Check for debian/watch file
|
||||
if (-r 'debian/watch') {
|
||||
unless (-r 'debian/changelog') {
|
||||
uscan_warn
|
||||
"Problems reading debian/changelog in $dir, skipping";
|
||||
next;
|
||||
}
|
||||
my ($package, $debversion, $uversion)
|
||||
= scan_changelog($config, $opwd);
|
||||
next unless ($package);
|
||||
|
||||
uscan_verbose
|
||||
"package=\"$package\" version=\"$uversion\" (no epoch/revision)";
|
||||
push @debdirs, [$debversion, $dir, $package, $uversion];
|
||||
}
|
||||
}
|
||||
|
||||
uscan_warn "No watch file found" unless @debdirs;
|
||||
|
||||
# Was there a --upstream-version option?
|
||||
if (defined $config->uversion) {
|
||||
if (@debdirs == 1) {
|
||||
$debdirs[0][3] = $config->uversion;
|
||||
} else {
|
||||
uscan_warn
|
||||
"ignoring --upstream-version as more than one debian/watch file found";
|
||||
}
|
||||
}
|
||||
|
||||
# Now sort the list of directories, so that we process the most recent
|
||||
# directories first, as determined by the package version numbers
|
||||
@debdirs = Devscripts::Versort::deb_versort(@debdirs);
|
||||
|
||||
# Now process the watch files in order. If a directory d has
|
||||
# subdirectories d/sd1/debian and d/sd2/debian, which each contain watch
|
||||
# files corresponding to the same package, then we only process the watch
|
||||
# file in the package with the latest version number.
|
||||
my %donepkgs;
|
||||
my @results;
|
||||
for my $debdir (@debdirs) {
|
||||
shift @$debdir; # don't need the Debian version number any longer
|
||||
my $dir = $$debdir[0];
|
||||
my $parentdir = dirname($dir);
|
||||
my $package = $$debdir[1];
|
||||
my $version = $$debdir[2];
|
||||
|
||||
if (exists $donepkgs{$parentdir}{$package}) {
|
||||
uscan_warn
|
||||
"Skipping $dir/debian/watch\n as this package has already been found";
|
||||
next;
|
||||
}
|
||||
|
||||
unless (chdir $origdir) {
|
||||
uscan_warn "Couldn't chdir back to $origdir, skipping: $!";
|
||||
next;
|
||||
}
|
||||
unless (chdir $dir) {
|
||||
uscan_warn "Couldn't chdir $dir, skipping: $!";
|
||||
next;
|
||||
}
|
||||
|
||||
uscan_verbose
|
||||
"$dir/debian/changelog sets package=\"$package\" version=\"$version\"";
|
||||
push @results, [$dir, $package, $version, "debian/watch", cwd];
|
||||
}
|
||||
unless (chdir $origdir) {
|
||||
uscan_die "Couldn't chdir back to $origdir! $!";
|
||||
}
|
||||
return @results;
|
||||
}
|
||||
|
||||
sub scan_changelog {
|
||||
my ($config, $opwd, $die) = @_;
|
||||
my $out
|
||||
= $die
|
||||
? sub { uscan_die(@_) }
|
||||
: sub { uscan_warn($_[0] . ', skipping'); return undef; };
|
||||
|
||||
# Figure out package info we need
|
||||
my $changelog = eval { changelog_parse(); };
|
||||
if ($@) {
|
||||
return $out->("Problems parsing debian/changelog");
|
||||
}
|
||||
|
||||
my ($package, $debversion, $uversion);
|
||||
$package = $changelog->{Source};
|
||||
return $out->("Problem determining the package name from debian/changelog")
|
||||
unless defined $package;
|
||||
$debversion = $changelog->{Version};
|
||||
return $out->("Problem determining the version from debian/changelog")
|
||||
unless defined $debversion;
|
||||
uscan_verbose
|
||||
"package=\"$package\" version=\"$debversion\" (as seen in debian/changelog)";
|
||||
|
||||
# Check the directory is properly named for safety
|
||||
if ($config->check_dirname_level == 2
|
||||
or ($config->check_dirname_level == 1 and cwd() ne $opwd)) {
|
||||
my $good_dirname;
|
||||
my $re = $config->check_dirname_regex;
|
||||
$re =~ s/PACKAGE/\Q$package\E/g;
|
||||
if ($re =~ m%/%) {
|
||||
$good_dirname = (cwd() =~ m%^$re$%);
|
||||
} else {
|
||||
$good_dirname = (basename(cwd()) =~ m%^$re$%);
|
||||
}
|
||||
return $out->("The directory name "
|
||||
. basename(cwd())
|
||||
. " doesn't match the requirement of\n"
|
||||
. " --check-dirname-level=$config->{check_dirname_level} --check-dirname-regex=$re .\n"
|
||||
. " Set --check-dirname-level=0 to disable this sanity check feature."
|
||||
) unless $good_dirname;
|
||||
}
|
||||
|
||||
# Get current upstream version number
|
||||
if (defined $config->uversion) {
|
||||
$uversion = $config->uversion;
|
||||
} else {
|
||||
$uversion = $debversion;
|
||||
$uversion =~ s/-[^-]+$//; # revision
|
||||
$uversion =~ s/^\d+://; # epoch
|
||||
}
|
||||
return ($package, $debversion, $uversion);
|
||||
}
|
||||
1;
|
317
lib/Devscripts/Uscan/Keyring.pm
Normal file
|
@ -0,0 +1,317 @@
|
|||
package Devscripts::Uscan::Keyring;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Dpkg::IPC;
|
||||
use Dpkg::Path qw/find_command/;
|
||||
use File::Copy qw/copy move/;
|
||||
use File::Path qw/make_path remove_tree/;
|
||||
use File::Temp qw/tempfile tempdir/;
|
||||
use List::Util qw/first/;
|
||||
use MIME::Base64;
|
||||
|
||||
# _pgp_* functions are strictly for applying or removing ASCII armor.
|
||||
# see https://www.rfc-editor.org/rfc/rfc9580.html#section-6 for more
|
||||
# details.
|
||||
|
||||
# Note that these _pgp_* functions are only necessary while relying on
|
||||
# gpgv, and gpgv itself does not verify multiple signatures correctly
|
||||
# (see https://bugs.debian.org/1010955)
|
||||
|
||||
sub _pgp_unarmor_data {
|
||||
my ($type, $data, $filename) = @_;
|
||||
# note that we ignore an incorrect or absent checksum, following the
|
||||
# guidance of
|
||||
# https://www.rfc-editor.org/rfc/rfc9580.html#section-6.1-3
|
||||
|
||||
my $armor_regex = qr{
|
||||
-----BEGIN\ PGP\ \Q$type\E-----[\r\t ]*\n
|
||||
(?:[^:\n]+:\ [^\n]*[\r\t ]*\n)*
|
||||
[\r\t ]*\n
|
||||
([a-zA-Z0-9/+\n]+={0,2})[\r\t ]*\n
|
||||
(?:=[a-zA-Z0-9/+]{4}[\r\t ]*\n)?
|
||||
-----END\ PGP\ \Q$type\E-----
|
||||
}xm;
|
||||
|
||||
my $blocks = 0;
|
||||
my $binary;
|
||||
while ($data =~ m/$armor_regex/g) {
|
||||
$binary .= decode_base64($1);
|
||||
$blocks++;
|
||||
}
|
||||
if ($blocks > 1) {
|
||||
uscan_warn "Found multiple concatenated ASCII Armor blocks in\n"
|
||||
. " $filename, which is not an interoperable construct.\n"
|
||||
. " See <https://tests.sequoia-pgp.org/results.html#ASCII_Armor>.\n"
|
||||
. " Please concatenate them into a single ASCII Armor block. For example:\n"
|
||||
. " sq keyring merge --overwrite --output $filename \\\n"
|
||||
. " $filename";
|
||||
}
|
||||
return $binary;
|
||||
}
|
||||
|
||||
sub _pgp_armor_checksum {
|
||||
my ($data) = @_;
|
||||
# from https://www.rfc-editor.org/rfc/rfc9580.html#section-6.1.1
|
||||
#
|
||||
# #define CRC24_INIT 0xB704CEL
|
||||
# #define CRC24_GENERATOR 0x864CFBL
|
||||
|
||||
# typedef unsigned long crc24;
|
||||
# crc24 crc_octets(unsigned char *octets, size_t len)
|
||||
# {
|
||||
# crc24 crc = CRC24_INIT;
|
||||
# int i;
|
||||
# while (len--) {
|
||||
# crc ^= (*octets++) << 16;
|
||||
# for (i = 0; i < 8; i++) {
|
||||
# crc <<= 1;
|
||||
# if (crc & 0x1000000) {
|
||||
# crc &= 0xffffff; /* Clear bit 25 to avoid overflow */
|
||||
# crc ^= CRC24_GENERATOR;
|
||||
# }
|
||||
# }
|
||||
# }
|
||||
# return crc & 0xFFFFFFL;
|
||||
# }
|
||||
#
|
||||
# the resulting three-octet-wide value then gets base64-encoded into
|
||||
# four base64 ASCII characters.
|
||||
|
||||
my $CRC24_INIT = 0xB704CE;
|
||||
my $CRC24_GENERATOR = 0x864CFB;
|
||||
|
||||
my @bytes = unpack 'C*', $data;
|
||||
my $crc = $CRC24_INIT;
|
||||
for my $b (@bytes) {
|
||||
$crc ^= ($b << 16);
|
||||
for (1 .. 8) {
|
||||
$crc <<= 1;
|
||||
if ($crc & 0x1000000) {
|
||||
$crc &= 0xffffff; # Clear bit 25 to avoid overflow
|
||||
$crc ^= $CRC24_GENERATOR;
|
||||
}
|
||||
}
|
||||
}
|
||||
my $sum
|
||||
= pack('CCC', (($crc >> 16) & 0xff, ($crc >> 8) & 0xff, $crc & 0xff));
|
||||
return encode_base64($sum, q{});
|
||||
}
|
||||
|
||||
sub _pgp_armor_data {
|
||||
my ($type, $data) = @_;
|
||||
my $out = encode_base64($data, q{}) =~ s/(.{1,64})/$1\n/gr;
|
||||
chomp $out;
|
||||
my $crc = _pgp_armor_checksum($data);
|
||||
my $armor = <<~"ARMOR";
|
||||
-----BEGIN PGP $type-----
|
||||
|
||||
$out
|
||||
=$crc
|
||||
-----END PGP $type-----
|
||||
ARMOR
|
||||
return $armor;
|
||||
}
|
||||
|
||||
sub new {
|
||||
my ($class) = @_;
|
||||
my $keyring;
|
||||
my $havegpgv = first { find_command($_) } qw(gpgv);
|
||||
my $havesopv = first { find_command($_) } qw(sopv);
|
||||
my $havesop
|
||||
= first { find_command($_) } qw(sqop rsop pgpainless-cli gosop);
|
||||
uscan_die("Please install a sopv variant.")
|
||||
unless (defined $havegpgv or defined $havesopv);
|
||||
|
||||
# upstream/signing-key.pgp and upstream-signing-key.pgp are deprecated
|
||||
# but supported
|
||||
if (-r "debian/upstream/signing-key.asc") {
|
||||
$keyring = "debian/upstream/signing-key.asc";
|
||||
} else {
|
||||
my $binkeyring = first { -r $_ } qw(
|
||||
debian/upstream/signing-key.pgp
|
||||
debian/upstream-signing-key.pgp
|
||||
);
|
||||
if (defined $binkeyring) {
|
||||
make_path('debian/upstream', { mode => 0700, verbose => 'true' });
|
||||
|
||||
# convert to the policy-compliant armored key
|
||||
uscan_verbose(
|
||||
"Found upstream binary signing keyring: $binkeyring");
|
||||
|
||||
# Need to convert to an armored key
|
||||
$keyring = "debian/upstream/signing-key.asc";
|
||||
uscan_warn "Found deprecated binary keyring ($binkeyring). "
|
||||
. "Please save it in armored format in $keyring. For example:\n"
|
||||
. " sop armor < $binkeyring > $keyring";
|
||||
if ($havesop) {
|
||||
spawn(
|
||||
exec => [$havesop, 'armor'],
|
||||
from_file => $binkeyring,
|
||||
to_file => $keyring,
|
||||
wait_child => 1,
|
||||
);
|
||||
} else {
|
||||
open my $inkeyring, '<', $binkeyring
|
||||
or uscan_warn(
|
||||
"Can't open $binkeyring to read deprecated binary keyring"
|
||||
);
|
||||
read $inkeyring, my $keycontent, -s $inkeyring;
|
||||
close $inkeyring;
|
||||
open my $outkeyring, '>', $keyring
|
||||
or uscan_warn(
|
||||
"Can't open $keyring for writing ASCII-armored keyring");
|
||||
my $outkey = _pgp_armor_data('PUBLIC KEY BLOCK', $keycontent);
|
||||
print $outkeyring $outkey
|
||||
or
|
||||
uscan_warn("Can't write ASCII-armored keyring to $keyring");
|
||||
close $outkeyring or uscan_warn("Failed to close $keyring");
|
||||
}
|
||||
|
||||
uscan_warn("Generated upstream signing keyring: $keyring");
|
||||
move $binkeyring, "$binkeyring.backup";
|
||||
uscan_verbose(
|
||||
"Renamed upstream binary signing keyring: $binkeyring.backup");
|
||||
}
|
||||
}
|
||||
|
||||
# Need to convert an armored key to binary for use by gpgv
|
||||
if (defined $keyring) {
|
||||
uscan_verbose("Found upstream signing keyring: $keyring");
|
||||
if ($keyring =~ m/\.asc$/ && !defined $havesopv)
|
||||
{ # binary keyring is only necessary for gpgv:
|
||||
my $pgpworkdir = tempdir(CLEANUP => 1);
|
||||
my $newkeyring = "$pgpworkdir/upstream-signing-key.pgp";
|
||||
open my $inkeyring, '<', $keyring
|
||||
or uscan_die("Can't open keyring file $keyring");
|
||||
read $inkeyring, my $keycontent, -s $inkeyring;
|
||||
close $inkeyring;
|
||||
my $binkey
|
||||
= _pgp_unarmor_data('PUBLIC KEY BLOCK', $keycontent, $keyring);
|
||||
if ($binkey) {
|
||||
open my $outkeyring, '>:raw', $newkeyring
|
||||
or uscan_die("Can't write to temporary keyring $newkeyring");
|
||||
print $outkeyring $binkey
|
||||
or uscan_die("Can't write $newkeyring");
|
||||
close $outkeyring or uscan_die("Can't close $newkeyring");
|
||||
$keyring = $newkeyring;
|
||||
} else {
|
||||
uscan_die("Failed to dearmor key(s) from $keyring");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
# Return undef if no key was found
|
||||
else {
|
||||
return undef;
|
||||
}
|
||||
my $self = bless {
|
||||
keyring => $keyring,
|
||||
gpgv => $havegpgv,
|
||||
sopv => $havesopv,
|
||||
}, $class;
|
||||
return $self;
|
||||
}
|
||||
|
||||
sub verify {
|
||||
my ($self, $sigfile, $newfile) = @_;
|
||||
uscan_verbose(
|
||||
"Verifying OpenPGP self signature of $newfile and extract $sigfile");
|
||||
if ($self->{sopv}) {
|
||||
spawn(
|
||||
exec => [$self->{sopv}, 'inline-verify', $self->{keyring}],
|
||||
from_file => $newfile,
|
||||
to_file => $sigfile,
|
||||
wait_child => 1
|
||||
) or uscan_die("OpenPGP signature did not verify.");
|
||||
} else {
|
||||
unless (
|
||||
uscan_exec_no_fail(
|
||||
$self->{gpgv},
|
||||
'--homedir' => '/dev/null',
|
||||
'--keyring' => $self->{keyring},
|
||||
'-o' => "$sigfile",
|
||||
"$newfile"
|
||||
) >> 8 == 0
|
||||
) {
|
||||
uscan_die("OpenPGP signature did not verify.");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
sub verifyv {
|
||||
my ($self, $sigfile, $base) = @_;
|
||||
uscan_verbose("Verifying OpenPGP signature $sigfile for $base");
|
||||
if ($self->{sopv}) {
|
||||
spawn(
|
||||
exec => [$self->{sopv}, 'verify', $sigfile, $self->{keyring}],
|
||||
from_file => $base,
|
||||
wait_child => 1
|
||||
) or uscan_die("OpenPGP signature did not verify.");
|
||||
} else {
|
||||
unless (
|
||||
uscan_exec_no_fail(
|
||||
$self->{gpgv},
|
||||
'--homedir' => '/dev/null',
|
||||
'--keyring' => $self->{keyring},
|
||||
$sigfile, $base
|
||||
) >> 8 == 0
|
||||
) {
|
||||
uscan_die("OpenPGP signature did not verify.");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
sub verify_git {
|
||||
my ($self, $gitdir, $tag, $git_upstream) = @_;
|
||||
my $commit;
|
||||
my @dir = $git_upstream ? () : ('--git-dir', $gitdir);
|
||||
spawn(
|
||||
exec => ['git', @dir, 'show-ref', $tag],
|
||||
to_string => \$commit
|
||||
);
|
||||
uscan_die "git tag not found" unless ($commit);
|
||||
$commit =~ s/\s.*$//;
|
||||
chomp $commit;
|
||||
my $file;
|
||||
spawn(
|
||||
exec => ['git', @dir, 'cat-file', '-p', $commit],
|
||||
to_string => \$file
|
||||
);
|
||||
my $dir;
|
||||
spawn(exec => ['mktemp', '-d'], to_string => \$dir);
|
||||
chomp $dir;
|
||||
|
||||
unless ($file =~ /^(.*?\n)(\-+\s*BEGIN PGP SIGNATURE\s*\-+.*)$/s) {
|
||||
uscan_die "Tag $tag is not signed";
|
||||
}
|
||||
open F, ">$dir/txt" or die $!;
|
||||
open S, ">$dir/sig" or die $!;
|
||||
print F $1;
|
||||
print S $2;
|
||||
close F;
|
||||
close S;
|
||||
|
||||
if ($self->{sopv}) {
|
||||
spawn(
|
||||
exec => [$self->{sopv}, 'verify', "$dir/sig", $self->{keyring}],
|
||||
from_file => "$dir/txt",
|
||||
wait_child => 1
|
||||
) or uscan_die("OpenPGP signature did not verify");
|
||||
} else {
|
||||
unless (
|
||||
uscan_exec_no_fail(
|
||||
$self->{gpgv},
|
||||
'--homedir' => '/dev/null',
|
||||
'--keyring' => $self->{keyring},
|
||||
"$dir/sig", "$dir/txt"
|
||||
) >> 8 == 0
|
||||
) {
|
||||
uscan_die("OpenPGP signature did not verify.");
|
||||
}
|
||||
}
|
||||
remove_tree($dir);
|
||||
}
|
||||
|
||||
1;
|
129
lib/Devscripts/Uscan/Output.pm
Normal file
|
@ -0,0 +1,129 @@
|
|||
package Devscripts::Uscan::Output;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Exporter 'import';
|
||||
use File::Basename;
|
||||
|
||||
our @EXPORT = (
|
||||
@Devscripts::Output::EXPORT, qw(
|
||||
uscan_msg uscan_verbose dehs_verbose uscan_warn uscan_debug uscan_msg_raw
|
||||
uscan_extra_debug uscan_die dehs_output $dehs $verbose $dehs_tags
|
||||
$dehs_start_output $dehs_end_output $found
|
||||
));
|
||||
|
||||
# ACCESSORS
|
||||
our ($dehs, $dehs_tags, $dehs_start_output, $dehs_end_output, $found)
|
||||
= (0, {}, 0, 0);
|
||||
|
||||
our $progname = basename($0);
|
||||
|
||||
sub printwarn_raw {
|
||||
my ($msg, $w) = @_;
|
||||
if ($w or $dehs) {
|
||||
print STDERR "$msg";
|
||||
} else {
|
||||
print "$msg";
|
||||
}
|
||||
}
|
||||
|
||||
sub printwarn {
|
||||
my ($msg, $w) = @_;
|
||||
chomp $msg;
|
||||
printwarn_raw("$msg\n", $w);
|
||||
}
|
||||
|
||||
sub uscan_msg_raw {
|
||||
printwarn_raw($_[0]);
|
||||
}
|
||||
|
||||
sub uscan_msg {
|
||||
printwarn($_[0]);
|
||||
}
|
||||
|
||||
sub uscan_verbose {
|
||||
ds_verbose($_[0], $dehs);
|
||||
}
|
||||
|
||||
sub uscan_debug {
|
||||
ds_debug($_[0], $dehs);
|
||||
}
|
||||
|
||||
sub uscan_extra_debug {
|
||||
ds_extra_debug($_[0], $dehs);
|
||||
}
|
||||
|
||||
sub dehs_verbose ($) {
|
||||
my $msg = $_[0];
|
||||
push @{ $dehs_tags->{'messages'} }, "$msg\n";
|
||||
uscan_verbose($msg);
|
||||
}
|
||||
|
||||
sub uscan_warn ($) {
|
||||
my $msg = $_[0];
|
||||
push @{ $dehs_tags->{'warnings'} }, $msg if $dehs;
|
||||
printwarn("$progname warn: $msg" . &Devscripts::Output::who_called, 1);
|
||||
}
|
||||
|
||||
sub uscan_die ($) {
|
||||
my $msg = $_[0];
|
||||
if ($dehs) {
|
||||
$dehs_tags = { 'errors' => "$msg" };
|
||||
$dehs_end_output = 1;
|
||||
dehs_output();
|
||||
}
|
||||
$msg = "$progname die: $msg" . &Devscripts::Output::who_called;
|
||||
if ($Devscripts::Output::die_on_error) {
|
||||
die $msg;
|
||||
}
|
||||
printwarn($msg, 1);
|
||||
}
|
||||
|
||||
sub dehs_output () {
|
||||
return unless $dehs;
|
||||
|
||||
if (!$dehs_start_output) {
|
||||
print "<dehs>\n";
|
||||
$dehs_start_output = 1;
|
||||
}
|
||||
|
||||
for my $tag (
|
||||
qw(package debian-uversion debian-mangled-uversion
|
||||
upstream-version upstream-url decoded-checksum
|
||||
status target target-path messages warnings errors)
|
||||
) {
|
||||
if (exists $dehs_tags->{$tag}) {
|
||||
if (ref $dehs_tags->{$tag} eq "ARRAY") {
|
||||
foreach my $entry (@{ $dehs_tags->{$tag} }) {
|
||||
$entry =~ s/&/&amp;/g;
|
||||
$entry =~ s/</&lt;/g;
|
||||
$entry =~ s/>/&gt;/g;
|
||||
print "<$tag>$entry</$tag>\n";
|
||||
}
|
||||
} else {
|
||||
$dehs_tags->{$tag} =~ s/&/&amp;/g;
|
||||
$dehs_tags->{$tag} =~ s/</&lt;/g;
|
||||
$dehs_tags->{$tag} =~ s/>/&gt;/g;
|
||||
print "<$tag>$dehs_tags->{$tag}</$tag>\n";
|
||||
}
|
||||
}
|
||||
}
|
||||
foreach my $cmp (@{ $dehs_tags->{'component-name'} }) {
|
||||
print qq'<component id="$cmp">\n';
|
||||
foreach my $tag (
|
||||
qw(debian-uversion debian-mangled-uversion
|
||||
upstream-version upstream-url target target-path)
|
||||
) {
|
||||
my $v = shift @{ $dehs_tags->{"component-$tag"} };
|
||||
print " <component-$tag>$v</component-$tag>\n" if $v;
|
||||
}
|
||||
print "</component>\n";
|
||||
}
|
||||
if ($dehs_end_output) {
|
||||
print "</dehs>\n";
|
||||
}
|
||||
|
||||
# Don't repeat output
|
||||
$dehs_tags = {};
|
||||
}
|
||||
1;
|
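An illustrative sketch (not part of the committed code) of a minimal DEHS emission using the exported package variables; the tag values are made up:

use Devscripts::Uscan::Output;

$dehs      = 1;
$dehs_tags = {
    package            => 'foo',
    'debian-uversion'  => '1.0-1',
    'upstream-version' => '1.1',
    'upstream-url'     => 'https://example.org/foo-1.1.tar.gz',
    status             => 'newer package available',
};
$dehs_end_output = 1;
dehs_output();    # prints the <dehs>...</dehs> block to STDOUT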
475
lib/Devscripts/Uscan/Utils.pm
Normal file
|
@ -0,0 +1,475 @@
|
|||
package Devscripts::Uscan::Utils;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Utils;
|
||||
use Exporter 'import';
|
||||
|
||||
our @EXPORT = (
|
||||
qw(fix_href recursive_regex_dir newest_dir get_compression
|
||||
get_suffix get_priority quoted_regex_parse safe_replace mangle
|
||||
uscan_exec uscan_exec_no_fail)
|
||||
);
|
||||
|
||||
#######################################################################
|
||||
# {{{ code 5: utility functions (download)
|
||||
#######################################################################
|
||||
sub fix_href ($) {
|
||||
my ($href) = @_;
|
||||
|
||||
# Remove newline (code moved from outside fix_href)
|
||||
$href =~ s/\n//g;
|
||||
|
||||
# Remove whitespace from URLs:
|
||||
# https://www.w3.org/TR/html5/links.html#links-created-by-a-and-area-elements
|
||||
$href =~ s/^\s+//;
|
||||
$href =~ s/\s+$//;
|
||||
|
||||
return $href;
|
||||
}
|
||||
|
||||
sub recursive_regex_dir ($$$$$$) {
|
||||
|
||||
# If '' is returned, the calling code returns 1
|
||||
my ($line, $base, $dirversionmangle, $watchfile, $lineptr,
|
||||
$download_version)
|
||||
= @_;
|
||||
|
||||
$base =~ m%^(\w+://[^/]+)/(.*)$%;
|
||||
my $site = $1;
|
||||
my @dirs = ();
|
||||
if (defined $2) {
|
||||
@dirs = split /(\/)/, $2;
|
||||
}
|
||||
my $dir = '/';
|
||||
|
||||
foreach my $dirpattern (@dirs) {
|
||||
if ($dirpattern =~ /\(.*\)/) {
|
||||
uscan_verbose "dir=>$dir dirpattern=>$dirpattern";
|
||||
my $newest_dir = newest_dir($line, $site, $dir, $dirpattern,
|
||||
$dirversionmangle, $watchfile, $lineptr, $download_version);
|
||||
uscan_verbose "newest_dir => '$newest_dir'";
|
||||
if ($newest_dir ne '') {
|
||||
$dir .= "$newest_dir";
|
||||
} else {
|
||||
uscan_debug "No \$newest_dir";
|
||||
return '';
|
||||
}
|
||||
} else {
|
||||
$dir .= "$dirpattern";
|
||||
}
|
||||
}
|
||||
return $site . $dir;
|
||||
}
|
||||
|
||||
# very similar to code above
|
||||
sub newest_dir ($$$$$$$$) {
|
||||
|
||||
# return $newdir on success
|
||||
# return '' on error, to make the grandparent code return 1
|
||||
my ($line, $site, $dir, $pattern, $dirversionmangle, $watchfile,
|
||||
$lineptr, $download_version)
|
||||
= @_;
|
||||
my ($newdir);
|
||||
uscan_verbose "Requesting URL:\n $site$dir";
|
||||
if ($site =~ m%^http(s)?://%) {
|
||||
require Devscripts::Uscan::http;
|
||||
$newdir = Devscripts::Uscan::http::http_newdir($1, @_);
|
||||
} elsif ($site =~ m%^ftp://%) {
|
||||
require Devscripts::Uscan::ftp;
|
||||
$newdir = Devscripts::Uscan::ftp::ftp_newdir(@_);
|
||||
} else {
|
||||
# Neither HTTP nor FTP site
|
||||
uscan_warn "neither HTTP nor FTP site, impossible case for newdir().";
|
||||
$newdir = '';
|
||||
}
|
||||
return $newdir;
|
||||
}
|
||||
#######################################################################
|
||||
# }}} code 5: utility functions (download)
|
||||
#######################################################################
|
||||
|
||||
#######################################################################
|
||||
# {{{ code 6: utility functions (compression)
|
||||
#######################################################################
|
||||
# Get legal values for compression
|
||||
sub get_compression ($) {
|
||||
my $compression = $_[0];
|
||||
my $canonical_compression;
|
||||
|
||||
# be liberal in what you accept...
|
||||
my %opt2comp = (
|
||||
gz => 'gzip',
|
||||
gzip => 'gzip',
|
||||
bz2 => 'bzip2',
|
||||
bzip2 => 'bzip2',
|
||||
lzma => 'lzma',
|
||||
xz => 'xz',
|
||||
zip => 'zip',
|
||||
zst => 'zst',
|
||||
zstd => 'zst',
|
||||
);
|
||||
|
||||
# Normalize compression methods to the names used by Dpkg::Compression
|
||||
if (exists $opt2comp{$compression}) {
|
||||
$canonical_compression = $opt2comp{$compression};
|
||||
} else {
|
||||
uscan_die "$progname: invalid compression, $compression given.";
|
||||
}
|
||||
return $canonical_compression;
|
||||
}
|
||||
|
||||
# Get legal values for compression suffix
|
||||
sub get_suffix ($) {
|
||||
my $compression = $_[0];
|
||||
my $canonical_suffix;
|
||||
|
||||
# be liberal in what you accept...
|
||||
my %opt2suffix = (
|
||||
gz => 'gz',
|
||||
gzip => 'gz',
|
||||
bz2 => 'bz2',
|
||||
bzip2 => 'bz2',
|
||||
lzma => 'lzma',
|
||||
xz => 'xz',
|
||||
zip => 'zip',
|
||||
zst => 'zst',
|
||||
zstd => 'zst',
|
||||
);
|
||||
|
||||
# Normalize compression methods to the names used by Dpkg::Compression
|
||||
if (exists $opt2suffix{$compression}) {
|
||||
$canonical_suffix = $opt2suffix{$compression};
|
||||
} elsif ($compression eq 'default') {
|
||||
require Devscripts::MkOrigtargz::Config;
|
||||
return &Devscripts::MkOrigtargz::Config::default_compression;
|
||||
} else {
|
||||
uscan_die "$progname: invalid suffix, $compression given.";
|
||||
}
|
||||
return $canonical_suffix;
|
||||
}
|
||||
|
||||
# Get compression priority
|
||||
sub get_priority ($) {
|
||||
my $href = $_[0];
|
||||
my $priority = 0;
|
||||
if ($href =~ m/\.tar\.gz/i) {
|
||||
$priority = 1;
|
||||
}
|
||||
if ($href =~ m/\.tar\.bz2/i) {
|
||||
$priority = 2;
|
||||
}
|
||||
if ($href =~ m/\.tar\.lzma/i) {
|
||||
$priority = 3;
|
||||
}
|
||||
#if ($href =~ m/\.tar\.zstd?/i) {
|
||||
# $priority = 4;
|
||||
#}
|
||||
if ($href =~ m/\.tar\.xz/i) {
|
||||
$priority = 4;
|
||||
}
|
||||
return $priority;
|
||||
}
|
||||
#######################################################################
|
||||
# }}} code 6: utility functions (compression)
|
||||
#######################################################################
|
||||
|
||||
#######################################################################
|
||||
# {{{ code 7: utility functions (regex)
|
||||
#######################################################################
|
||||
sub quoted_regex_parse($) {
|
||||
my $pattern = shift;
|
||||
my %closers = ('{', '}', '[', ']', '(', ')', '<', '>');
|
||||
|
||||
$pattern =~ /^(s|tr|y)(.)(.*)$/;
|
||||
my ($sep, $rest) = ($2, $3 || '');
|
||||
my $closer = $closers{$sep};
|
||||
|
||||
my $parsed_ok = 1;
|
||||
my $regexp = '';
|
||||
my $replacement = '';
|
||||
my $flags = '';
|
||||
my $open = 1;
|
||||
my $last_was_escape = 0;
|
||||
my $in_replacement = 0;
|
||||
|
||||
for my $char (split //, $rest) {
|
||||
if ($char eq $sep and !$last_was_escape) {
|
||||
$open++;
|
||||
if ($open == 1) {
|
||||
if ($in_replacement) {
|
||||
|
||||
# Separator after end of replacement
|
||||
uscan_warn "Extra \"$sep\" after end of replacement.";
|
||||
$parsed_ok = 0;
|
||||
last;
|
||||
} else {
|
||||
$in_replacement = 1;
|
||||
}
|
||||
} else {
|
||||
if ($open > 1) {
|
||||
if ($in_replacement) {
|
||||
$replacement .= $char;
|
||||
} else {
|
||||
$regexp .= $char;
|
||||
}
|
||||
}
|
||||
}
|
||||
} elsif ($char eq $closer and !$last_was_escape) {
|
||||
$open--;
|
||||
if ($open > 0) {
|
||||
if ($in_replacement) {
|
||||
$replacement .= $char;
|
||||
} else {
|
||||
$regexp .= $char;
|
||||
}
|
||||
} elsif ($open < 0) {
|
||||
uscan_warn "Extra \"$closer\" after end of replacement.";
|
||||
$parsed_ok = 0;
|
||||
last;
|
||||
}
|
||||
} else {
|
||||
if ($in_replacement) {
|
||||
if ($open) {
|
||||
$replacement .= $char;
|
||||
} else {
|
||||
$flags .= $char;
|
||||
}
|
||||
} else {
|
||||
if ($open) {
|
||||
$regexp .= $char;
|
||||
} elsif ($char !~ m/\s/) {
|
||||
uscan_warn
|
||||
"Non-whitespace between <...> and <...> (or similars).";
|
||||
$parsed_ok = 0;
|
||||
last;
|
||||
}
|
||||
|
||||
# skip if blanks between <...> and <...> (or similars)
|
||||
}
|
||||
}
|
||||
|
||||
# Don't treat \\ as an escape
|
||||
$last_was_escape = ($char eq '\\' and !$last_was_escape);
|
||||
}
|
||||
|
||||
unless ($in_replacement and $open == 0) {
|
||||
uscan_warn "Empty replacement string.";
|
||||
$parsed_ok = 0;
|
||||
}
|
||||
|
||||
return ($parsed_ok, $regexp, $replacement, $flags);
|
||||
}
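# A minimal sketch of what the parser returns for a brace-delimited rule
# (the rule itself is made up):
#
#   my ($ok, $regexp, $replacement, $flags)
#     = quoted_regex_parse('s{^v(\d+)}{$1}g');
#   # $ok = 1, $regexp = '^v(\d+)', $replacement = '$1', $flags = 'g'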
|
||||
|
||||
sub safe_replace($$) {
|
||||
my ($in, $pat) = @_;
|
||||
eval "uscan_debug \"safe_replace input=\\\"\$\$in\\\"\\n\"";
|
||||
$pat =~ s/^\s*(.*?)\s*$/$1/;
|
||||
|
||||
$pat =~ /^(s|tr|y)(.)/;
|
||||
my ($op, $sep) = ($1, $2 || '');
|
||||
my $esc = "\Q$sep\E";
|
||||
my ($parsed_ok, $regexp, $replacement, $flags);
|
||||
|
||||
if ($sep eq '{' or $sep eq '(' or $sep eq '[' or $sep eq '<') {
|
||||
($parsed_ok, $regexp, $replacement, $flags) = quoted_regex_parse($pat);
|
||||
|
||||
unless ($parsed_ok) {
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " mangling rule with <...>, (...), {...} failed.";
|
||||
return 0;
|
||||
}
|
||||
} elsif ($pat
|
||||
!~ /^(?:s|tr|y)$esc((?:\\.|[^\\$esc])*)$esc((?:\\.|[^\\$esc])*)$esc([a-z]*)$/
|
||||
) {
|
||||
$sep = "/" if $sep eq '';
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " rule doesn't match \"(s|tr|y)$sep.*$sep.*$sep\[a-z\]*\" (or similar).";
|
||||
return 0;
|
||||
} else {
|
||||
($regexp, $replacement, $flags) = ($1, $2, $3);
|
||||
}
|
||||
|
||||
uscan_debug
|
||||
"safe_replace with regexp=\"$regexp\", replacement=\"$replacement\", and flags=\"$flags\"";
|
||||
my $safeflags = $flags;
|
||||
if ($op eq 'tr' or $op eq 'y') {
|
||||
$safeflags =~ tr/cds//cd;
|
||||
if ($safeflags ne $flags) {
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " flags must consist of \"cds\" only.";
|
||||
return 0;
|
||||
}
|
||||
|
||||
$regexp =~ s/\\(.)/$1/g;
|
||||
$replacement =~ s/\\(.)/$1/g;
|
||||
|
||||
$regexp =~ s/([^-])/'\\x' . unpack 'H*', $1/ge;
|
||||
$replacement =~ s/([^-])/'\\x' . unpack 'H*', $1/ge;
|
||||
|
||||
eval "\$\$in =~ tr<$regexp><$replacement>$flags;";
|
||||
|
||||
if ($@) {
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " mangling \"tr\" or \"y\" rule execution failed.";
|
||||
return 0;
|
||||
} else {
|
||||
return 1;
|
||||
}
|
||||
} else {
|
||||
$safeflags =~ tr/gix//cd;
|
||||
if ($safeflags ne $flags) {
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " flags must consist of \"gix\" only.";
|
||||
return 0;
|
||||
}
|
||||
|
||||
my $global = ($flags =~ s/g//);
|
||||
$flags = "(?$flags)" if length $flags;
|
||||
|
||||
my $slashg;
|
||||
if ($regexp =~ /(?<!\\)(\\\\)*\\G/) {
|
||||
$slashg = 1;
|
||||
|
||||
# if it's not initial, it is too dangerous
|
||||
if ($regexp =~ /^.*[^\\](\\\\)*\\G/) {
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " dangerous use of \\G with regexp=\"$regexp\".";
|
||||
return 0;
|
||||
}
|
||||
}
|
||||
|
||||
# Behave like Perl and treat e.g. "\." in replacement as "."
|
||||
# We allow the case escape characters to remain and
|
||||
# process them later
|
||||
$replacement =~ s/(^|[^\\])\\([^luLUE])/$1$2/g;
|
||||
|
||||
# Unescape escaped separator characters
|
||||
$replacement =~ s/\\\Q$sep\E/$sep/g;
|
||||
|
||||
# If bracketing quotes were used, also unescape the
|
||||
# closing version
|
||||
### {{ ### (FOOL EDITOR for non-quoted kets)
|
||||
$replacement =~ s/\\\Q}\E/}/g if $sep eq '{';
|
||||
$replacement =~ s/\\\Q]\E/]/g if $sep eq '[';
|
||||
$replacement =~ s/\\\Q)\E/)/g if $sep eq '(';
|
||||
$replacement =~ s/\\\Q>\E/>/g if $sep eq '<';
|
||||
|
||||
# The replacement below will modify $replacement so keep
|
||||
# a copy. We'll need to restore it to the current value if
|
||||
# the global flag was set on the input pattern.
|
||||
my $orig_replacement = $replacement;
|
||||
|
||||
my ($first, $last, $pos, $zerowidth, $matched, @captures) = (0, -1, 0);
|
||||
while (1) {
|
||||
eval {
|
||||
# handle errors due to unsafe constructs in $regexp
|
||||
no re 'eval';
|
||||
|
||||
# restore position
|
||||
pos($$in) = $pos if $pos;
|
||||
|
||||
if ($zerowidth) {
|
||||
|
||||
# previous match was a zero-width match, simulate it to set
|
||||
# the internal flag that avoids the infinite loop
|
||||
$$in =~ /()/g;
|
||||
}
|
||||
|
||||
# Need to use /g to make it use and save pos()
|
||||
$matched = ($$in =~ /$flags$regexp/g);
|
||||
|
||||
if ($matched) {
|
||||
|
||||
# save position and size of the match
|
||||
my $oldpos = $pos;
|
||||
$pos = pos($$in);
|
||||
($first, $last) = ($-[0], $+[0]);
|
||||
|
||||
if ($slashg) {
|
||||
|
||||
# \G in the match, weird things can happen
|
||||
$zerowidth = ($pos == $oldpos);
|
||||
|
||||
# For example, matching without a match
|
||||
$matched = 0
|
||||
if ( not defined $first
|
||||
or not defined $last);
|
||||
} else {
|
||||
$zerowidth = ($last - $first == 0);
|
||||
}
|
||||
for my $i (0 .. $#-) {
|
||||
$captures[$i] = substr $$in, $-[$i], $+[$i] - $-[$i];
|
||||
}
|
||||
}
|
||||
};
|
||||
if ($@) {
|
||||
uscan_warn "stop mangling: rule=\"$pat\"\n"
|
||||
. " mangling \"s\" rule execution failed.";
|
||||
return 0;
|
||||
}
|
||||
|
||||
# No match; leave the original string untouched but return
|
||||
# success as there was nothing wrong with the pattern
|
||||
return 1 unless $matched;
|
||||
|
||||
# Replace $X
|
||||
$replacement
|
||||
=~ s/[\$\\](\d)/defined $captures[$1] ? $captures[$1] : ''/ge;
|
||||
$replacement
|
||||
=~ s/\$\{(\d)\}/defined $captures[$1] ? $captures[$1] : ''/ge;
|
||||
$replacement =~ s/\$&/$captures[0]/g;
|
||||
|
||||
# Make \l etc escapes work
|
||||
$replacement =~ s/\\l(.)/lc $1/e;
|
||||
$replacement =~ s/\\L(.*?)(\\E|\z)/lc $1/e;
|
||||
$replacement =~ s/\\u(.)/uc $1/e;
|
||||
$replacement =~ s/\\U(.*?)(\\E|\z)/uc $1/e;
|
||||
|
||||
# Actually do the replacement
|
||||
substr $$in, $first, $last - $first, $replacement;
|
||||
|
||||
# Update position
|
||||
$pos += length($replacement) - ($last - $first);
|
||||
|
||||
if ($global) {
|
||||
$replacement = $orig_replacement;
|
||||
} else {
|
||||
last;
|
||||
}
|
||||
}
|
||||
|
||||
return 1;
|
||||
}
|
||||
}
|
||||
|
||||
# call this as
|
||||
# if mangle($watchfile, \$line, 'uversionmangle:',
|
||||
# \@{$options{'uversionmangle'}}, \$version) {
|
||||
# return 1;
|
||||
# }
|
||||
sub mangle($$$$$) {
|
||||
my ($watchfile, $lineptr, $name, $rulesptr, $verptr) = @_;
|
||||
foreach my $pat (@{$rulesptr}) {
|
||||
if (!safe_replace($verptr, $pat)) {
|
||||
uscan_warn "In $watchfile, potentially"
|
||||
. " unsafe or malformed $name"
|
||||
. " pattern:\n '$pat'"
|
||||
. " found. Skipping watchline\n"
|
||||
. " $$lineptr";
|
||||
return 1;
|
||||
}
|
||||
uscan_debug "After $name $$verptr";
|
||||
}
|
||||
return 0;
|
||||
}
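# A minimal sketch of the calling convention shown in the comment above,
# with a made-up uversionmangle rule and version:
#
#   my $version = '2.0-rc1';
#   if (mangle($watchfile, \$line, 'uversionmangle:', ['s/-rc/~rc/'], \$version)) {
#       return 1;    # malformed or unsafe rule, watch line skipped
#   }
#   # on success mangle() returns 0 and $version is now '2.0~rc1'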
|
||||
|
||||
*uscan_exec_no_fail = \&ds_exec_no_fail;
|
||||
|
||||
*uscan_exec = \&ds_exec;
|
||||
|
||||
#######################################################################
|
||||
# }}} code 7: utility functions (regex)
|
||||
#######################################################################
|
||||
|
||||
1;
|
517
lib/Devscripts/Uscan/WatchFile.pm
Normal file
|
@@ -0,0 +1,517 @@
|
|||
|
||||
=head1 NAME
|
||||
|
||||
Devscripts::Uscan::WatchFile - watchfile object for L<uscan>
|
||||
|
||||
=head1 SYNOPSIS
|
||||
|
||||
use Devscripts::Uscan::Config;
|
||||
use Devscripts::Uscan::WatchFile;
|
||||
|
||||
my $config = Devscripts::Uscan::Config->new({
|
||||
# Uscan config parameters. Example:
|
||||
destdir => '..',
|
||||
});
|
||||
|
||||
# You can use Devscripts::Uscan::FindFiles to find watchfiles
|
||||
|
||||
my $wf = Devscripts::Uscan::WatchFile->new({
|
||||
config => $config,
|
||||
package => $package,
|
||||
pkg_dir => $pkg_dir,
|
||||
pkg_version => $version,
|
||||
watchfile => $watchfile,
|
||||
});
|
||||
return $wf->status if ( $wf->status );
|
||||
|
||||
# Do the job
|
||||
return $wf->process_lines;
|
||||
|
||||
=head1 DESCRIPTION
|
||||
|
||||
Uscan class to parse watchfiles.
|
||||
|
||||
=head1 METHODS
|
||||
|
||||
=head2 new() I<(Constructor)>
|
||||
|
||||
Parses the watch file and creates a L<Devscripts::Uscan::WatchLine> object for
|
||||
each line.
|
||||
|
||||
=head3 Required parameters
|
||||
|
||||
=over
|
||||
|
||||
=item config: L<Devscripts::Uscan::Config> object
|
||||
|
||||
=item package: Debian package name
|
||||
|
||||
=item pkg_dir: Working directory
|
||||
|
||||
=item pkg_version: Current Debian package version
|
||||
|
||||
=back
|
||||
|
||||
=head2 Main accessors
|
||||
|
||||
=over
|
||||
|
||||
=item watchlines: ref to the array that contains watchline objects
|
||||
|
||||
=item watch_version: format version of the watchfile
|
||||
|
||||
=back
|
||||
|
||||
=head2 process_lines()
|
||||
|
||||
Method that launches Devscripts::Uscan::WatchLine::process() on each watchline.
|
||||
|
||||
=head1 SEE ALSO
|
||||
|
||||
L<uscan>, L<Devscripts::Uscan::WatchLine>, L<Devscripts::Uscan::Config>,
|
||||
L<Devscripts::Uscan::FindFiles>
|
||||
|
||||
=head1 AUTHOR
|
||||
|
||||
B<uscan> was originally written by Christoph Lameter
|
||||
E<lt>clameter@debian.orgE<gt> (I believe), modified by Julian Gilbey
|
||||
E<lt>jdg@debian.orgE<gt>. HTTP support was added by Piotr Roszatycki
|
||||
E<lt>dexter@debian.orgE<gt>. B<uscan> was rewritten in Perl by Julian Gilbey.
|
||||
Xavier Guimard E<lt>yadd@debian.orgE<gt> rewrote uscan in object
|
||||
oriented Perl.
|
||||
|
||||
=head1 COPYRIGHT AND LICENSE
|
||||
|
||||
Copyright 2002-2006 by Julian Gilbey <jdg@debian.org>,
|
||||
2018 by Xavier Guimard <yadd@debian.org>
|
||||
|
||||
This program is free software; you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation; either version 2 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
=cut
|
||||
|
||||
package Devscripts::Uscan::WatchFile;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Uscan::Downloader;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::WatchLine;
|
||||
use Dpkg::Version;
|
||||
use File::Copy qw/copy move/;
|
||||
use List::Util qw/first/;
|
||||
use Moo;
|
||||
|
||||
use constant {
|
||||
ANY_VERSION => '(?:[-_]?[Vv]?(\d[\-+\.:\~\da-zA-Z]*))',
|
||||
ARCHIVE_EXT =>
|
||||
'(?i)(?:\.(?:tar\.xz|tar\.bz2|tar\.gz|tar\.zstd?|zip|tgz|tbz|txz))',
|
||||
DEB_EXT => '(?:[\+~](debian|dfsg|ds|deb)(\.)?(\d+)?$)',
|
||||
};
|
||||
use constant SIGNATURE_EXT => ARCHIVE_EXT . '(?:\.(?:asc|pgp|gpg|sig|sign))';
|
||||
|
||||
# Required new() parameters
|
||||
has config => (is => 'rw', required => 1);
|
||||
has package => (is => 'ro', required => 1); # Debian package
|
||||
has pkg_dir => (is => 'ro', required => 1);
|
||||
has pkg_version => (is => 'ro', required => 1);
|
||||
has bare => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->config->bare });
|
||||
has download => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->config->download });
|
||||
has downloader => (
|
||||
is => 'ro',
|
||||
lazy => 1,
|
||||
default => sub {
|
||||
Devscripts::Uscan::Downloader->new({
|
||||
timeout => $_[0]->config->timeout,
|
||||
agent => $_[0]->config->user_agent,
|
||||
pasv => $_[0]->config->pasv,
|
||||
destdir => $_[0]->config->destdir,
|
||||
headers => $_[0]->config->http_header,
|
||||
});
|
||||
},
|
||||
);
|
||||
has signature => (
|
||||
is => 'rw',
|
||||
required => 1,
|
||||
lazy => 1,
|
||||
default => sub { $_[0]->config->signature });
|
||||
has watchfile => (is => 'ro', required => 1); # usually debian/watch
|
||||
|
||||
# Internal attributes
|
||||
has group => (is => 'rw', default => sub { [] });
|
||||
has origcount => (is => 'rw');
|
||||
has origtars => (is => 'rw', default => sub { [] });
|
||||
has status => (is => 'rw', default => sub { 0 });
|
||||
has watch_version => (is => 'rw');
|
||||
has watchlines => (is => 'rw', default => sub { [] });
|
||||
|
||||
# Values shared between lines
|
||||
has shared => (
|
||||
is => 'rw',
|
||||
lazy => 1,
|
||||
default => \&new_shared,
|
||||
);
|
||||
|
||||
sub new_shared {
|
||||
return {
|
||||
bare => $_[0]->bare,
|
||||
components => [],
|
||||
common_newversion => undef,
|
||||
common_mangled_newversion => undef,
|
||||
download => $_[0]->download,
|
||||
download_version => undef,
|
||||
origcount => undef,
|
||||
origtars => [],
|
||||
previous_download_available => undef,
|
||||
previous_newversion => undef,
|
||||
previous_newfile_base => undef,
|
||||
previous_sigfile_base => undef,
|
||||
signature => $_[0]->signature,
|
||||
uscanlog => undef,
|
||||
};
|
||||
}
|
||||
has keyring => (
|
||||
is => 'ro',
|
||||
default => sub { Devscripts::Uscan::Keyring->new });
|
||||
|
||||
sub BUILD {
|
||||
my ($self, $args) = @_;
|
||||
my $watch_version = 0;
|
||||
my $nextline;
|
||||
$dehs_tags = {};
|
||||
|
||||
uscan_verbose "Process watch file at: $args->{watchfile}\n"
|
||||
. " package = $args->{package}\n"
|
||||
. " version = $args->{pkg_version}\n"
|
||||
. " pkg_dir = $args->{pkg_dir}";
|
||||
|
||||
$self->origcount(0); # reset to 0 for each watch file
|
||||
unless (open WATCH, $args->{watchfile}) {
|
||||
uscan_warn "could not open $args->{watchfile}: $!";
|
||||
return 1;
|
||||
}
|
||||
|
||||
my $lineNumber = 0;
|
||||
while (<WATCH>) {
|
||||
next if /^\s*\#/;
|
||||
next if /^\s*$/;
|
||||
s/^\s*//;
|
||||
|
||||
CHOMP:
|
||||
|
||||
# Reassemble lines split using \
|
||||
chomp;
|
||||
if (s/(?<!\\)\\$//) {
|
||||
if (eof(WATCH)) {
|
||||
uscan_warn
|
||||
"$args->{watchfile} ended with \\; skipping last line";
|
||||
$self->status(1);
|
||||
last;
|
||||
}
|
||||
if ($watch_version > 3) {
|
||||
|
||||
# drop leading \s only if version 4
|
||||
$nextline = <WATCH>;
|
||||
$nextline =~ s/^\s*//;
|
||||
$_ .= $nextline;
|
||||
} else {
|
||||
$_ .= <WATCH>;
|
||||
}
|
||||
goto CHOMP;
|
||||
}
|
||||
|
||||
# "version" must be the first field
|
||||
if (!$watch_version) {
|
||||
|
||||
# Looking for "version" field.
|
||||
if (/^version\s*=\s*(\d+)(\s|$)/) { # Found
|
||||
$watch_version = $1;
|
||||
|
||||
# Note that version=1 watchfiles have no "version" field so
|
||||
                # authorized values are >= 2 and <= CURRENT_WATCHFILE_VERSION
|
||||
if ( $watch_version < 2
|
||||
or $watch_version
|
||||
> $Devscripts::Uscan::Config::CURRENT_WATCHFILE_VERSION) {
|
||||
# "version" field found but has no authorizated value
|
||||
uscan_warn
|
||||
"$args->{watchfile} version number is unrecognised; skipping watch file";
|
||||
last;
|
||||
}
|
||||
|
||||
# Next line
|
||||
next;
|
||||
}
|
||||
|
||||
# version=1 is deprecated
|
||||
else {
|
||||
$watch_version = 1;
|
||||
}
|
||||
}
|
||||
if ($watch_version < 3) {
|
||||
uscan_warn
|
||||
"$args->{watchfile} is an obsolete version $watch_version watch file;\n"
|
||||
. " please upgrade to a higher version\n"
|
||||
. " (see uscan(1) for details).";
|
||||
}
|
||||
|
||||
# "version" is fixed, parsing lines now
|
||||
|
||||
# Are there any warnings from this part to give if we're using dehs?
|
||||
dehs_output if ($dehs);
|
||||
|
||||
# Handle shell \\ -> \
|
||||
s/\\\\/\\/g if $watch_version == 1;
|
||||
|
||||
# Handle @PACKAGE@ @ANY_VERSION@ @ARCHIVE_EXT@ substitutions
|
||||
s/\@PACKAGE\@/$args->{package}/g;
|
||||
s/\@ANY_VERSION\@/ANY_VERSION/ge;
|
||||
s/\@ARCHIVE_EXT\@/ARCHIVE_EXT/ge;
|
||||
s/\@SIGNATURE_EXT\@/SIGNATURE_EXT/ge;
|
||||
s/\@DEB_EXT\@/DEB_EXT/ge;
|
||||
|
||||
my $line = Devscripts::Uscan::WatchLine->new({
|
||||
# Shared between lines
|
||||
config => $self->config,
|
||||
downloader => $self->downloader,
|
||||
shared => $self->shared,
|
||||
keyring => $self->keyring,
|
||||
|
||||
# Other parameters
|
||||
line => $_,
|
||||
pkg => $self->package,
|
||||
pkg_dir => $self->pkg_dir,
|
||||
pkg_version => $self->pkg_version,
|
||||
watch_version => $watch_version,
|
||||
watchfile => $self->watchfile,
|
||||
});
|
||||
push @{ $self->group }, $lineNumber
|
||||
if ($line->type and $line->type =~ /^(?:group|checksum)$/);
|
||||
push @{ $self->watchlines }, $line;
|
||||
$lineNumber++;
|
||||
}
|
||||
|
||||
close WATCH
|
||||
or $self->status(1),
|
||||
uscan_warn "problems reading $$args->{watchfile}: $!";
|
||||
$self->watch_version($watch_version);
|
||||
}
|
||||
|
||||
sub process_lines {
|
||||
my ($self) = shift;
|
||||
return $self->process_group if (@{ $self->group });
|
||||
foreach (@{ $self->watchlines }) {
|
||||
|
||||
# search newfile and newversion
|
||||
my $res = $_->process;
|
||||
$self->status($res) if ($res);
|
||||
}
|
||||
return $self->{status};
|
||||
}
|
||||
|
||||
sub process_group {
|
||||
my ($self) = @_;
|
||||
my $saveDconfig = $self->config->download_version;
|
||||
# Build version
|
||||
my @cur_versions = split /\+~/, $self->pkg_version;
|
||||
my $checksum = 0;
|
||||
my $newChecksum = 0;
|
||||
if ( $cur_versions[$#cur_versions]
|
||||
and $cur_versions[$#cur_versions] =~ s/^cs//) {
|
||||
$checksum = pop @cur_versions;
|
||||
}
|
||||
my (@new_versions, @last_debian_mangled_uversions, @last_versions);
|
||||
my $download = 0;
|
||||
my $last_shared = $self->shared;
|
||||
my $last_comp_version;
|
||||
my @dversion;
|
||||
my @ck_versions;
|
||||
# Isolate component and following lines
|
||||
if (my $v = $self->config->download_version) {
|
||||
@dversion = map { s/\+.*$//; /^cs/ ? () : $_ } split /\+~/, $v;
|
||||
}
|
||||
foreach my $line (@{ $self->watchlines }) {
|
||||
if ( $line->type and $line->type eq 'group'
|
||||
or $line->type eq 'checksum') {
|
||||
$last_shared = $self->new_shared;
|
||||
$last_comp_version = shift @cur_versions if $line->type eq 'group';
|
||||
}
|
||||
if ($line->type and $line->type eq 'group') {
|
||||
$line->{groupDversion} = shift @dversion;
|
||||
}
|
||||
$line->shared($last_shared);
|
||||
$line->pkg_version($last_comp_version || 0);
|
||||
}
|
||||
# Check if download is needed
|
||||
foreach my $line (@{ $self->watchlines }) {
|
||||
next unless ($line->type eq 'group' or $line->type eq 'checksum');
|
||||
# Stop on error
|
||||
$self->config->download_version($line->{groupDversion})
|
||||
if $line->{groupDversion};
|
||||
$self->config->download_version(undef) if $line->type eq 'checksum';
|
||||
if ( $line->parse
|
||||
or $line->search
|
||||
or $line->get_upstream_url
|
||||
or $line->get_newfile_base
|
||||
or ($line->type eq 'group' and $line->cmp_versions)
|
||||
or ($line->ctype and $line->cmp_versions)) {
|
||||
$self->{status} += $line->status;
|
||||
return $self->{status};
|
||||
}
|
||||
$download = $line->shared->{download}
|
||||
if $line->shared->{download} > $download
|
||||
and ($line->type eq 'group' or $line->ctype);
|
||||
}
|
||||
foreach my $line (@{ $self->watchlines }) {
|
||||
next unless $line->type eq 'checksum';
|
||||
$newChecksum
|
||||
= $self->sum($newChecksum, $line->search_result->{newversion});
|
||||
push @ck_versions, $line->search_result->{newversion};
|
||||
}
|
||||
foreach my $line (@{ $self->watchlines }) {
|
||||
next unless ($line->type eq 'checksum');
|
||||
$line->parse_result->{mangled_lastversion} = $checksum;
|
||||
my $tmp = $line->search_result->{newversion};
|
||||
$line->search_result->{newversion} = $newChecksum;
|
||||
unless ($line->ctype) {
|
||||
if ($line->cmp_versions) {
|
||||
$self->{status} += $line->status;
|
||||
return $self->{status};
|
||||
}
|
||||
$download = $line->shared->{download}
|
||||
if $line->shared->{download} > $download;
|
||||
}
|
||||
$line->search_result->{newversion} = $tmp;
|
||||
if ($line->component) {
|
||||
pop @{ $dehs_tags->{'component-upstream-version'} };
|
||||
push @{ $dehs_tags->{'component-upstream-version'} }, $tmp;
|
||||
}
|
||||
}
|
||||
foreach my $line (@{ $self->watchlines }) {
|
||||
# Set same $download for all
|
||||
$line->shared->{download} = $download;
|
||||
# Non "group" lines where not initialized
|
||||
unless ($line->type eq 'group' or $line->type eq 'checksum') {
|
||||
if ( $line->parse
|
||||
or $line->search
|
||||
or $line->get_upstream_url
|
||||
or $line->get_newfile_base
|
||||
or $line->cmp_versions) {
|
||||
$self->{status} += $line->status;
|
||||
return $self->{status};
|
||||
}
|
||||
}
|
||||
if ($line->download_file_and_sig) {
|
||||
$self->{status} += $line->status;
|
||||
return $self->{status};
|
||||
}
|
||||
if ($line->mkorigtargz) {
|
||||
$self->{status} += $line->status;
|
||||
return $self->{status};
|
||||
}
|
||||
if ($line->type eq 'group') {
|
||||
push @new_versions, $line->shared->{common_mangled_newversion}
|
||||
|| $line->shared->{common_newversion}
|
||||
|| ();
|
||||
push @last_versions, $line->parse_result->{lastversion};
|
||||
push @last_debian_mangled_uversions,
|
||||
$line->parse_result->{mangled_lastversion};
|
||||
}
|
||||
}
|
||||
my $new_version = join '+~', @new_versions;
|
||||
if ($newChecksum) {
|
||||
$new_version .= "+~cs$newChecksum";
|
||||
}
|
||||
if ($checksum) {
|
||||
push @last_versions, "cs$newChecksum";
|
||||
push @last_debian_mangled_uversions, "cs$checksum";
|
||||
}
|
||||
$dehs_tags->{'upstream-version'} = $new_version;
|
||||
$dehs_tags->{'debian-uversion'} = join('+~', @last_versions)
|
||||
if (grep { $_ } @last_versions);
|
||||
$dehs_tags->{'debian-mangled-uversion'} = join '+~',
|
||||
@last_debian_mangled_uversions
|
||||
if (grep { $_ } @last_debian_mangled_uversions);
|
||||
my $mangled_ver
|
||||
= Dpkg::Version->new(
|
||||
"1:" . $dehs_tags->{'debian-mangled-uversion'} . "-0",
|
||||
check => 0);
|
||||
my $upstream_ver = Dpkg::Version->new("1:$new_version-0", check => 0);
|
||||
if ($mangled_ver == $upstream_ver) {
|
||||
$dehs_tags->{'status'} = "up to date";
|
||||
} elsif ($mangled_ver > $upstream_ver) {
|
||||
$dehs_tags->{'status'} = "only older package available";
|
||||
} else {
|
||||
$dehs_tags->{'status'} = "newer package available";
|
||||
}
|
||||
foreach my $line (@{ $self->watchlines }) {
|
||||
my $path = $line->destfile or next;
|
||||
my $ver = $line->shared->{common_mangled_newversion};
|
||||
$path =~ s/\Q$ver\E/$new_version/;
|
||||
uscan_warn "rename $line->{destfile} to $path\n";
|
||||
rename $line->{destfile}, $path;
|
||||
if ($dehs_tags->{"target-path"} eq $line->{destfile}) {
|
||||
$dehs_tags->{"target-path"} = $path;
|
||||
$dehs_tags->{target} =~ s/\Q$ver\E/$new_version/;
|
||||
} else {
|
||||
for (
|
||||
my $i = 0 ;
|
||||
$i < @{ $dehs_tags->{"component-target-path"} } ;
|
||||
$i++
|
||||
) {
|
||||
if ($dehs_tags->{"component-target-path"}->[$i] eq
|
||||
$line->{destfile}) {
|
||||
$dehs_tags->{"component-target-path"}->[$i] = $path;
|
||||
$dehs_tags->{"component-target"}->[$i]
|
||||
=~ s/\Q$ver\E/$new_version/
|
||||
or die $ver;
|
||||
}
|
||||
}
|
||||
}
|
||||
if ($line->signature_available) {
|
||||
rename "$line->{destfile}.asc", "$path.asc";
|
||||
rename "$line->{destfile}.sig", "$path.sig";
|
||||
}
|
||||
}
|
||||
if (@ck_versions) {
|
||||
my $v = join '+~', @ck_versions;
|
||||
if ($dehs) {
|
||||
$dehs_tags->{'decoded-checksum'} = $v;
|
||||
} else {
|
||||
uscan_verbose 'Checksum ref: ' . join('+~', @ck_versions) . "\n";
|
||||
}
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
sub sum {
|
||||
my ($self, @versions) = @_;
|
||||
my (@res, @str);
|
||||
foreach my $v (@versions) {
|
||||
my @tmp = grep { $_ ne '.' } version_split_digits($v);
|
||||
for (my $i = 0 ; $i < @tmp ; $i++) {
|
||||
$str[$i] //= '';
|
||||
$res[$i] //= 0;
|
||||
if ($tmp[$i] =~ /^\d+$/) {
|
||||
$res[$i] += $tmp[$i];
|
||||
} else {
|
||||
uscan_die
|
||||
"Checksum supports only digits in versions, $tmp[$i] is not accepted";
|
||||
}
|
||||
}
|
||||
}
|
||||
for (my $i = 0 ; $i < @res ; $i++) {
|
||||
my $tmp = shift @str;
|
||||
$res[$i] .= $tmp if $tmp ne '';
|
||||
}
|
||||
push @res, @str;
|
||||
return join '.', @res;
|
||||
}
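# Sketch of the digit-wise addition used by process_group() above to build
# the composite "cs" (checksum) version (component versions are made up;
# $wf is a Devscripts::Uscan::WatchFile object):
#
#   $wf->sum('1.2.3', '4.5');    # returns '5.7.3' (1+4, 2+5, 3)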
|
||||
|
||||
1;
|
1876
lib/Devscripts/Uscan/WatchLine.pm
Normal file
File diff suppressed because it is too large
95
lib/Devscripts/Uscan/_vcs.pm
Normal file
|
@@ -0,0 +1,95 @@
|
|||
# Common sub shared between git and svn
|
||||
package Devscripts::Uscan::_vcs;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Exporter 'import';
|
||||
use File::Basename;
|
||||
|
||||
our @EXPORT = ('get_refs');
|
||||
|
||||
our $progname = basename($0);
|
||||
|
||||
sub _vcs_newfile_base {
|
||||
my ($self) = @_;
|
||||
# Compression may optionally be deferred to mk-origtargz
|
||||
my $newfile_base = "$self->{pkg}-$self->{search_result}->{newversion}.tar";
|
||||
if (!$self->config->{vcs_export_uncompressed}) {
|
||||
$newfile_base .= '.' . get_suffix($self->compression);
|
||||
}
|
||||
return $newfile_base;
|
||||
}
|
||||
|
||||
sub get_refs {
|
||||
my ($self, $command, $ref_pattern, $package) = @_;
|
||||
my @command = @$command;
|
||||
my ($newfile, $newversion);
|
||||
{
|
||||
local $, = ' ';
|
||||
uscan_verbose "Execute: @command";
|
||||
}
|
||||
open(REFS, "-|", @command)
|
||||
|| uscan_die "$progname: you must have the $package package installed";
|
||||
my @refs;
|
||||
my $ref;
|
||||
my $version;
|
||||
while (<REFS>) {
|
||||
chomp;
|
||||
uscan_debug "$_";
|
||||
if ($_ =~ $ref_pattern) {
|
||||
$ref = $1;
|
||||
foreach my $_pattern (@{ $self->patterns }) {
|
||||
$version = join(".",
|
||||
map { $_ if defined($_) } $ref =~ m&^$_pattern$&);
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'uversionmangle:', \@{ $self->uversionmangle },
|
||||
\$version
|
||||
)
|
||||
) {
|
||||
return undef;
|
||||
}
|
||||
push @refs, [$version, $ref];
|
||||
}
|
||||
}
|
||||
}
|
||||
if (@refs) {
|
||||
@refs = Devscripts::Versort::upstream_versort(@refs);
|
||||
my $msg = "Found the following matching refs:\n";
|
||||
foreach my $ref (@refs) {
|
||||
$msg .= " $$ref[1] ($$ref[0])\n";
|
||||
}
|
||||
uscan_verbose "$msg";
|
||||
if ($self->shared->{download_version}
|
||||
and not $self->versionmode eq 'ignore') {
|
||||
|
||||
            # extract the ones whose $version matched $download_version in the loop above
|
||||
my @vrefs
|
||||
= grep { $$_[0] eq $self->shared->{download_version} } @refs;
|
||||
if (@vrefs) {
|
||||
($newversion, $newfile) = @{ $vrefs[0] };
|
||||
} else {
|
||||
uscan_warn
|
||||
"$progname warning: In $self->{watchfile} no matching"
|
||||
. " refs for version "
|
||||
. $self->shared->{download_version}
|
||||
. " in watch line\n "
|
||||
. $self->{line};
|
||||
return undef;
|
||||
}
|
||||
|
||||
} else {
|
||||
($newversion, $newfile) = @{ $refs[0] };
|
||||
}
|
||||
} else {
|
||||
uscan_warn "$progname warning: In $self->{watchfile},\n"
|
||||
. " no matching refs for watch line\n"
|
||||
. " $self->{line}";
|
||||
return undef;
|
||||
}
|
||||
return ($newversion, $newfile);
|
||||
}
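# Sketch of how git.pm drives this sub: its ref pattern isolates the ref
# from each "git ls-remote" line, then the watch-file pattern extracts the
# candidate version (the tag and the watch pattern below are hypothetical):
#
#   my $ls_remote = "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3\trefs/tags/v1.2.3";
#   $ls_remote =~ qr/^\S+\s+([^\^\{\}]+)$/;            # $1 = 'refs/tags/v1.2.3'
#   my ($version) = $1 =~ m{^refs/tags/v(\d\S*)$};     # '1.2.3'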
|
||||
|
||||
1;
|
90
lib/Devscripts/Uscan/_xtp.pm
Normal file
|
@@ -0,0 +1,90 @@
|
|||
# Common sub shared between http and ftp
|
||||
package Devscripts::Uscan::_xtp;
|
||||
|
||||
use strict;
|
||||
use File::Basename;
|
||||
use Exporter 'import';
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
|
||||
our @EXPORT = ('partial_version');
|
||||
|
||||
sub _xtp_newfile_base {
|
||||
my ($self) = @_;
|
||||
my $newfile_base;
|
||||
if (@{ $self->filenamemangle }) {
|
||||
|
||||
# HTTP or FTP site (with filenamemangle)
|
||||
if ($self->versionless) {
|
||||
$newfile_base = $self->upstream_url;
|
||||
} else {
|
||||
$newfile_base = $self->search_result->{newfile};
|
||||
}
|
||||
my $cmp = $newfile_base;
|
||||
uscan_verbose "Matching target for filenamemangle: $newfile_base";
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'filenamemangle:', \@{ $self->filenamemangle },
|
||||
\$newfile_base
|
||||
)
|
||||
) {
|
||||
$self->status(1);
|
||||
return undef;
|
||||
}
|
||||
if ($newfile_base =~ m/^(?:https?|ftp):/) {
|
||||
$newfile_base = basename($newfile_base);
|
||||
}
|
||||
if ($cmp eq $newfile_base) {
|
||||
uscan_die "filenamemangle failed for $cmp";
|
||||
}
|
||||
unless ($self->search_result->{newversion}) {
|
||||
|
||||
            # uversionmangled version is '', make best effort to set it
|
||||
$newfile_base
|
||||
=~ m/^.+?[-_]?(\d[\-+\.:\~\da-zA-Z]*)(?:\.tar\.(gz|bz2|xz|zstd?)|\.zip)$/i;
|
||||
$self->search_result->{newversion} = $1;
|
||||
unless ($self->search_result->{newversion}) {
|
||||
uscan_warn
|
||||
"Fix filenamemangle to produce a filename with the correct version";
|
||||
$self->status(1);
|
||||
return undef;
|
||||
}
|
||||
uscan_verbose
|
||||
"Newest upstream tarball version from the filenamemangled filename: $self->{search_result}->{newversion}";
|
||||
}
|
||||
} else {
|
||||
# HTTP or FTP site (without filenamemangle)
|
||||
$newfile_base = basename($self->search_result->{newfile});
|
||||
if ($self->mode eq 'http') {
|
||||
|
||||
# Remove HTTP header trash
|
||||
            $newfile_base =~ s/[\?#].*$//;    # PyPI
|
||||
# just in case this leaves us with nothing
|
||||
if ($newfile_base eq '') {
|
||||
uscan_warn
|
||||
"No good upstream filename found after removing tailing ?... and #....\n Use filenamemangle to fix this.";
|
||||
$self->status(1);
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
}
|
||||
return $newfile_base;
|
||||
}
|
||||
|
||||
sub partial_version {
|
||||
my ($download_version) = @_;
|
||||
my ($d1, $d2, $d3);
|
||||
if (defined $download_version) {
|
||||
uscan_verbose "download version requested: $download_version";
|
||||
if ($download_version
|
||||
=~ m/^([-~\+\w]+)(\.[-~\+\w]+)?(\.[-~\+\w]+)?(\.[-~\+\w]+)?$/) {
|
||||
$d1 = "$1" if defined $1;
|
||||
$d2 = "$1$2" if defined $2;
|
||||
$d3 = "$1$2$3" if defined $3;
|
||||
}
|
||||
}
|
||||
return ($d1, $d2, $d3);
|
||||
}
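# Sketch: partial_version() expands a requested download version into its
# one-, two- and three-component prefixes (the version below is made up):
#
#   my ($d1, $d2, $d3) = partial_version('1.2.3');
#   # ('1', '1.2', '1.2.3') - lets e.g. a "1.2" directory match a request for 1.2.3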
|
||||
|
||||
1;
|
280
lib/Devscripts/Uscan/ftp.pm
Normal file
|
@@ -0,0 +1,280 @@
|
|||
package Devscripts::Uscan::ftp;
|
||||
|
||||
use strict;
|
||||
use Cwd qw/abs_path/;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Devscripts::Uscan::_xtp;
|
||||
use Moo::Role;
|
||||
|
||||
#######################################################################
|
||||
# search $newfile $newversion (ftp mode)
|
||||
#######################################################################
|
||||
sub ftp_search {
|
||||
my ($self) = @_;
|
||||
|
||||
# FTP site
|
||||
uscan_verbose "Requesting URL:\n $self->{parse_result}->{base}";
|
||||
my $request = HTTP::Request->new('GET', $self->parse_result->{base});
|
||||
my $response = $self->downloader->user_agent->request($request);
|
||||
if (!$response->is_success) {
|
||||
uscan_warn
|
||||
"In watch file $self->{watchfile}, reading FTP directory\n $self->{parse_result}->{base} failed: "
|
||||
. $response->status_line . "";
|
||||
return undef;
|
||||
}
|
||||
|
||||
my $content = $response->content;
|
||||
uscan_extra_debug
|
||||
"received content:\n$content\n[End of received content] by FTP";
|
||||
|
||||
# FTP directory listings either look like:
|
||||
# info info ... info filename [ -> linkname]
|
||||
# or they're HTMLised (if they've been through an HTTP proxy)
|
||||
# so we may have to look for <a href="filename"> type patterns
|
||||
uscan_verbose "matching pattern $self->{parse_result}->{pattern}";
|
||||
my (@files);
|
||||
|
||||
# We separate out HTMLised listings from standard listings, so
|
||||
# that we can target our search correctly
|
||||
if ($content =~ /<\s*a\s+[^>]*href/i) {
|
||||
uscan_verbose "HTMLized FTP listing by the HTTP proxy";
|
||||
while ($content
|
||||
=~ m/(?:<\s*a\s+[^>]*href\s*=\s*\")((?-i)$self->{parse_result}->{pattern})\"/gi
|
||||
) {
|
||||
my $file = fix_href($1);
|
||||
my $mangled_version
|
||||
= join(".", $file =~ m/^$self->{parse_result}->{pattern}$/);
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'uversionmangle:', \@{ $self->uversionmangle },
|
||||
\$mangled_version
|
||||
)
|
||||
) {
|
||||
return undef;
|
||||
}
|
||||
my $match = '';
|
||||
if (defined $self->shared->{download_version}
|
||||
and not $self->versionmode eq 'ignore') {
|
||||
if ($mangled_version eq $self->shared->{download_version}) {
|
||||
$match = "matched with the download version";
|
||||
}
|
||||
}
|
||||
my $priority = $mangled_version . '-' . get_priority($file);
|
||||
push @files, [$priority, $mangled_version, $file, $match];
|
||||
}
|
||||
} else {
|
||||
uscan_verbose "Standard FTP listing.";
|
||||
|
||||
# they all look like:
|
||||
# info info ... info filename [ -> linkname]
|
||||
for my $ln (split(/\n/, $content)) {
|
||||
$ln =~ s/^d.*$//; # FTP listing of directory, '' skipped
|
||||
$ln =~ s/\s+->\s+\S+$//; # FTP listing for link destination
|
||||
$ln =~ s/^.*\s(\S+)$/$1/; # filename only
|
||||
if ($ln and $ln =~ m/^($self->{parse_result}->{filepattern})$/) {
|
||||
my $file = $1;
|
||||
my $mangled_version = join(".",
|
||||
$file =~ m/^$self->{parse_result}->{filepattern}$/);
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'uversionmangle:', \@{ $self->uversionmangle },
|
||||
\$mangled_version
|
||||
)
|
||||
) {
|
||||
return undef;
|
||||
}
|
||||
my $match = '';
|
||||
if (defined $self->shared->{download_version}) {
|
||||
if ($mangled_version eq $self->shared->{download_version})
|
||||
{
|
||||
$match = "matched with the download version";
|
||||
}
|
||||
}
|
||||
my $priority = $mangled_version . '-' . get_priority($file);
|
||||
push @files, [$priority, $mangled_version, $file, $match];
|
||||
}
|
||||
}
|
||||
}
|
||||
if (@files) {
|
||||
@files = Devscripts::Versort::versort(@files);
|
||||
my $msg
|
||||
= "Found the following matching files on the web page (newest first):\n";
|
||||
foreach my $file (@files) {
|
||||
$msg .= " $$file[2] ($$file[1]) index=$$file[0] $$file[3]\n";
|
||||
}
|
||||
uscan_verbose $msg;
|
||||
}
|
||||
my ($newversion, $newfile);
|
||||
if (defined $self->shared->{download_version}) {
|
||||
|
||||
        # extract the ones for which $match was set in the loop above
|
||||
my @vfiles = grep { $$_[3] } @files;
|
||||
if (@vfiles) {
|
||||
(undef, $newversion, $newfile, undef) = @{ $vfiles[0] };
|
||||
} else {
|
||||
uscan_warn
|
||||
"In $self->{watchfile} no matching files for version $self->{shared}->{download_version}"
|
||||
. " in watch line\n $self->{line}";
|
||||
return undef;
|
||||
}
|
||||
} else {
|
||||
if (@files) {
|
||||
(undef, $newversion, $newfile, undef) = @{ $files[0] };
|
||||
} else {
|
||||
uscan_warn
|
||||
"In $self->{watchfile} no matching files for watch line\n $self->{line}";
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
return ($newversion, $newfile);
|
||||
}
|
||||
|
||||
sub ftp_upstream_url {
|
||||
my ($self) = @_;
|
||||
return $self->parse_result->{base} . $self->search_result->{newfile};
|
||||
}
|
||||
|
||||
*ftp_newfile_base = \&Devscripts::Uscan::_xtp::_xtp_newfile_base;
|
||||
|
||||
sub ftp_newdir {
|
||||
my ($line, $site, $dir, $pattern, $dirversionmangle, $watchfile,
|
||||
$lineptr, $download_version)
|
||||
= @_;
|
||||
my $downloader = $line->downloader;
|
||||
|
||||
my ($request, $response, $newdir);
|
||||
my ($download_version_short1, $download_version_short2,
|
||||
$download_version_short3)
|
||||
= partial_version($download_version);
|
||||
my $base = $site . $dir;
|
||||
$request = HTTP::Request->new('GET', $base);
|
||||
$response = $downloader->user_agent->request($request);
|
||||
if (!$response->is_success) {
|
||||
uscan_warn
|
||||
"In watch file $watchfile, reading webpage\n $base failed: "
|
||||
. $response->status_line;
|
||||
return '';
|
||||
}
|
||||
|
||||
my $content = $response->content;
|
||||
uscan_extra_debug
|
||||
"received content:\n$content\n[End of received content] by FTP";
|
||||
|
||||
# FTP directory listings either look like:
|
||||
# info info ... info filename [ -> linkname]
|
||||
# or they're HTMLised (if they've been through an HTTP proxy)
|
||||
# so we may have to look for <a href="filename"> type patterns
|
||||
uscan_verbose "matching pattern $pattern";
|
||||
my (@dirs);
|
||||
my $match = '';
|
||||
|
||||
# We separate out HTMLised listings from standard listings, so
|
||||
# that we can target our search correctly
|
||||
if ($content =~ /<\s*a\s+[^>]*href/i) {
|
||||
uscan_verbose "HTMLized FTP listing by the HTTP proxy";
|
||||
while (
|
||||
$content =~ m/(?:<\s*a\s+[^>]*href\s*=\s*\")((?-i)$pattern)\"/gi) {
|
||||
my $dir = $1;
|
||||
uscan_verbose "Matching target for dirversionmangle: $dir";
|
||||
my $mangled_version = join(".", $dir =~ m/^$pattern$/);
|
||||
if (
|
||||
mangle(
|
||||
$watchfile, $lineptr,
|
||||
'dirversionmangle:', \@{$dirversionmangle},
|
||||
\$mangled_version
|
||||
)
|
||||
) {
|
||||
return 1;
|
||||
}
|
||||
$match = '';
|
||||
if (defined $download_version
|
||||
and $mangled_version eq $download_version) {
|
||||
$match = "matched with the download version";
|
||||
}
|
||||
if (defined $download_version_short3
|
||||
and $mangled_version eq $download_version_short3) {
|
||||
$match = "matched with the download version (partial 3)";
|
||||
}
|
||||
if (defined $download_version_short2
|
||||
and $mangled_version eq $download_version_short2) {
|
||||
$match = "matched with the download version (partial 2)";
|
||||
}
|
||||
if (defined $download_version_short1
|
||||
and $mangled_version eq $download_version_short1) {
|
||||
$match = "matched with the download version (partial 1)";
|
||||
}
|
||||
push @dirs, [$mangled_version, $dir, $match];
|
||||
}
|
||||
} else {
|
||||
# they all look like:
|
||||
# info info ... info filename [ -> linkname]
|
||||
uscan_verbose "Standard FTP listing.";
|
||||
foreach my $ln (split(/\n/, $content)) {
|
||||
$ln =~ s/^-.*$//; # FTP listing of file, '' skipped
|
||||
$ln =~ s/\s+->\s+\S+$//; # FTP listing for link destination
|
||||
$ln =~ s/^.*\s(\S+)$/$1/; # filename only
|
||||
if ($ln =~ m/^($pattern)(\s+->\s+\S+)?$/) {
|
||||
my $dir = $1;
|
||||
uscan_verbose "Matching target for dirversionmangle: $dir";
|
||||
my $mangled_version = join(".", $dir =~ m/^$pattern$/);
|
||||
if (
|
||||
mangle(
|
||||
$watchfile, $lineptr,
|
||||
'dirversionmangle:', \@{$dirversionmangle},
|
||||
\$mangled_version
|
||||
)
|
||||
) {
|
||||
return 1;
|
||||
}
|
||||
$match = '';
|
||||
if (defined $download_version
|
||||
and $mangled_version eq $download_version) {
|
||||
$match = "matched with the download version";
|
||||
}
|
||||
if (defined $download_version_short3
|
||||
and $mangled_version eq $download_version_short3) {
|
||||
$match = "matched with the download version (partial 3)";
|
||||
}
|
||||
if (defined $download_version_short2
|
||||
and $mangled_version eq $download_version_short2) {
|
||||
$match = "matched with the download version (partial 2)";
|
||||
}
|
||||
if (defined $download_version_short1
|
||||
and $mangled_version eq $download_version_short1) {
|
||||
$match = "matched with the download version (partial 1)";
|
||||
}
|
||||
push @dirs, [$mangled_version, $dir, $match];
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
    # extract the ones for which $match was set in the loop above
|
||||
my @vdirs = grep { $$_[2] } @dirs;
|
||||
if (@vdirs) {
|
||||
@vdirs = Devscripts::Versort::upstream_versort(@vdirs);
|
||||
$newdir = $vdirs[0][1];
|
||||
}
|
||||
if (@dirs) {
|
||||
@dirs = Devscripts::Versort::upstream_versort(@dirs);
|
||||
my $msg
|
||||
= "Found the following matching FTP directories (newest first):\n";
|
||||
foreach my $dir (@dirs) {
|
||||
$msg .= " $$dir[1] ($$dir[0]) $$dir[2]\n";
|
||||
}
|
||||
uscan_verbose $msg;
|
||||
$newdir //= $dirs[0][1];
|
||||
} else {
|
||||
uscan_warn
|
||||
"In $watchfile no matching dirs for pattern\n $base$pattern";
|
||||
$newdir = '';
|
||||
}
|
||||
return $newdir;
|
||||
}
|
||||
|
||||
# Nothing to clean here
|
||||
sub ftp_clean { 0 }
|
||||
|
||||
1;
|
192
lib/Devscripts/Uscan/git.pm
Normal file
|
@@ -0,0 +1,192 @@
|
|||
package Devscripts::Uscan::git;
|
||||
|
||||
use strict;
|
||||
use Cwd qw/abs_path/;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Devscripts::Uscan::_vcs;
|
||||
use Dpkg::IPC;
|
||||
use File::Path 'remove_tree';
|
||||
use Moo::Role;
|
||||
|
||||
######################################################
|
||||
# search $newfile $newversion (git mode/versionless)
|
||||
######################################################
|
||||
sub git_search {
|
||||
my ($self) = @_;
|
||||
my ($newfile, $newversion);
|
||||
if ($self->versionless) {
|
||||
$newfile = $self->parse_result->{filepattern}; # HEAD or heads/<branch>
|
||||
if ($self->pretty eq 'describe') {
|
||||
$self->git->{mode} = 'full';
|
||||
}
|
||||
if ( $self->git->{mode} eq 'shallow'
|
||||
and $self->parse_result->{filepattern} eq 'HEAD') {
|
||||
uscan_exec(
|
||||
'git',
|
||||
'clone',
|
||||
'--quiet',
|
||||
'--bare',
|
||||
'--depth=1',
|
||||
$self->parse_result->{base},
|
||||
"$self->{downloader}->{destdir}/" . $self->gitrepo_dir
|
||||
);
|
||||
$self->downloader->gitrepo_state(1);
|
||||
} elsif ($self->git->{mode} eq 'shallow'
|
||||
and $self->parse_result->{filepattern} ne 'HEAD')
|
||||
{ # heads/<branch>
|
||||
$newfile =~ s&^heads/&&; # Set to <branch>
|
||||
uscan_exec(
|
||||
'git',
|
||||
'clone',
|
||||
'--quiet',
|
||||
'--bare',
|
||||
'--depth=1',
|
||||
'-b',
|
||||
"$newfile",
|
||||
$self->parse_result->{base},
|
||||
"$self->{downloader}->{destdir}/" . $self->gitrepo_dir
|
||||
);
|
||||
$self->downloader->gitrepo_state(1);
|
||||
} else {
|
||||
uscan_exec(
|
||||
'git', 'clone', '--quiet', '--bare',
|
||||
$self->parse_result->{base},
|
||||
"$self->{downloader}->{destdir}/" . $self->gitrepo_dir
|
||||
);
|
||||
$self->downloader->gitrepo_state(2);
|
||||
}
|
||||
if ($self->pretty eq 'describe') {
|
||||
|
||||
# use unannotated tags to be on safe side
|
||||
spawn(
|
||||
exec => [
|
||||
'git',
|
||||
"--git-dir=$self->{downloader}->{destdir}/$self->{gitrepo_dir}",
|
||||
'describe',
|
||||
'--tags'
|
||||
],
|
||||
wait_child => 1,
|
||||
to_string => \$newversion
|
||||
);
|
||||
$newversion =~ s/-/./g;
|
||||
chomp($newversion);
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'uversionmangle:', \@{ $self->uversionmangle },
|
||||
\$newversion
|
||||
)
|
||||
) {
|
||||
return undef;
|
||||
}
|
||||
} else {
|
||||
my $tmp = $ENV{TZ};
|
||||
$ENV{TZ} = 'UTC';
|
||||
$newfile
|
||||
= $self->parse_result->{filepattern}; # HEAD or heads/<branch>
|
||||
if ($self->parse_result->{filepattern} eq 'HEAD') {
|
||||
spawn(
|
||||
exec => [
|
||||
'git',
|
||||
"--git-dir=$self->{downloader}->{destdir}/$self->{gitrepo_dir}",
|
||||
'log',
|
||||
'-1',
|
||||
"--date=format-local:$self->{date}",
|
||||
"--pretty=$self->{pretty}"
|
||||
],
|
||||
wait_child => 1,
|
||||
to_string => \$newversion
|
||||
);
|
||||
} else {
|
||||
$newfile =~ s&^heads/&&; # Set to <branch>
|
||||
spawn(
|
||||
exec => [
|
||||
'git',
|
||||
"--git-dir=$self->{downloader}->{destdir}/$self->{gitrepo_dir}",
|
||||
'log',
|
||||
'-1',
|
||||
'-b',
|
||||
"$newfile",
|
||||
"--date=format-local:$self->{date}",
|
||||
"--pretty=$self->{pretty}"
|
||||
],
|
||||
wait_child => 1,
|
||||
to_string => \$newversion
|
||||
);
|
||||
}
|
||||
$ENV{TZ} = $tmp;
|
||||
chomp($newversion);
|
||||
}
|
||||
}
|
||||
################################################
|
||||
# search $newfile $newversion (git mode w/tag)
|
||||
################################################
|
||||
elsif ($self->mode eq 'git') {
|
||||
my @args = ('ls-remote', $self->parse_result->{base});
|
||||
# Try to use local upstream branch if available
|
||||
if (-d '.git') {
|
||||
my $out;
|
||||
eval {
|
||||
spawn(
|
||||
exec => ['git', 'remote', '--verbose', 'show'],
|
||||
wait_child => 1,
|
||||
to_string => \$out
|
||||
);
|
||||
};
|
||||
# Check if git repo found in debian/watch exists in
|
||||
# `git remote show` output
|
||||
if ($out and $out =~ /^(\S+)\s+\Q$self->{parse_result}->{base}\E/m)
|
||||
{
|
||||
$self->downloader->git_upstream($1);
|
||||
uscan_warn
|
||||
"Using $self->{downloader}->{git_upstream} remote origin";
|
||||
# Found, launch a "fetch" to be up to date
|
||||
spawn(
|
||||
exec => ['git', 'fetch', $self->downloader->git_upstream],
|
||||
wait_child => 1
|
||||
);
|
||||
@args = ('show-ref');
|
||||
}
|
||||
}
|
||||
($newversion, $newfile)
|
||||
= get_refs($self, ['git', @args], qr/^\S+\s+([^\^\{\}]+)$/, 'git');
|
||||
return undef if !defined $newversion;
|
||||
}
|
||||
return ($newversion, $newfile);
|
||||
}
|
||||
|
||||
sub git_upstream_url {
|
||||
my ($self) = @_;
|
||||
my $upstream_url
|
||||
= $self->parse_result->{base} . ' ' . $self->search_result->{newfile};
|
||||
return $upstream_url;
|
||||
}
|
||||
|
||||
*git_newfile_base = \&Devscripts::Uscan::_vcs::_vcs_newfile_base;
|
||||
|
||||
sub git_clean {
|
||||
my ($self) = @_;
|
||||
|
||||
# If git cloned repo exists and not --debug ($verbose=2) -> remove it
|
||||
if ( $self->downloader->gitrepo_state > 0
|
||||
and $verbose < 2
|
||||
and !$self->downloader->git_upstream) {
|
||||
my $err;
|
||||
uscan_verbose "Removing git repo ($self->{downloader}->{destdir}/"
|
||||
. $self->gitrepo_dir . ")";
|
||||
remove_tree "$self->{downloader}->{destdir}/" . $self->gitrepo_dir,
|
||||
{ error => \$err };
|
||||
if (@$err) {
|
||||
local $, = "\n\t";
|
||||
uscan_warn "Errors during git repo clean:\n\t@$err";
|
||||
}
|
||||
$self->downloader->gitrepo_state(0);
|
||||
} else {
|
||||
uscan_debug "Keep git repo ($self->{downloader}->{destdir}/"
|
||||
. $self->gitrepo_dir . ")";
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
1;
|
510
lib/Devscripts/Uscan/http.pm
Normal file
|
@@ -0,0 +1,510 @@
|
|||
package Devscripts::Uscan::http;
|
||||
|
||||
use strict;
|
||||
use Cwd qw/abs_path/;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Devscripts::Uscan::_xtp;
|
||||
use Moo::Role;
|
||||
|
||||
*http_newfile_base = \&Devscripts::Uscan::_xtp::_xtp_newfile_base;
|
||||
|
||||
##################################
|
||||
# search $newversion (http mode)
|
||||
##################################
|
||||
|
||||
#returns (\@patterns, \@base_sites, \@base_dirs)
|
||||
sub handle_redirection {
|
||||
my ($self, $pattern, @additional_bases) = @_;
|
||||
my @redirections = @{ $self->downloader->user_agent->get_redirections };
|
||||
my (@patterns, @base_sites, @base_dirs);
|
||||
|
||||
uscan_verbose "redirections: @redirections" if @redirections;
|
||||
|
||||
foreach my $_redir (@redirections, @additional_bases) {
|
||||
my $base_dir = $_redir;
|
||||
|
||||
$base_dir =~ s%^\w+://[^/]+/%/%;
|
||||
$base_dir =~ s%/[^/]*(?:[#?].*)?$%/%;
|
||||
if ($_redir =~ m%^(\w+://[^/]+)%) {
|
||||
my $base_site = $1;
|
||||
|
||||
push @patterns,
|
||||
quotemeta($base_site) . quotemeta($base_dir) . "$pattern";
|
||||
push @base_sites, $base_site;
|
||||
push @base_dirs, $base_dir;
|
||||
|
||||
# remove the filename, if any
|
||||
my $base_dir_orig = $base_dir;
|
||||
$base_dir =~ s%/[^/]*$%/%;
|
||||
if ($base_dir ne $base_dir_orig) {
|
||||
push @patterns,
|
||||
quotemeta($base_site) . quotemeta($base_dir) . "$pattern";
|
||||
push @base_sites, $base_site;
|
||||
push @base_dirs, $base_dir;
|
||||
}
|
||||
}
|
||||
}
|
||||
return (\@patterns, \@base_sites, \@base_dirs);
|
||||
}
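# Sketch of how a single redirection URL is reduced to a site and a
# directory by the substitutions above (the URL is hypothetical):
#
#   my $redir = 'https://example.org/downloads/foo-1.0.tar.gz';
#   (my $base_dir = $redir) =~ s%^\w+://[^/]+/%/%;   # '/downloads/foo-1.0.tar.gz'
#   $base_dir =~ s%/[^/]*(?:[#?].*)?$%/%;            # '/downloads/'
#   my ($base_site) = $redir =~ m%^(\w+://[^/]+)%;   # 'https://example.org'
#   # pushed pattern: quotemeta($base_site) . quotemeta($base_dir) . $pattern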
|
||||
|
||||
sub http_search {
|
||||
my ($self) = @_;
|
||||
|
||||
# $content: web page to be scraped to find the URLs to be downloaded
|
||||
if ($self->{parse_result}->{base} =~ /^https/ and !$self->downloader->ssl)
|
||||
{
|
||||
uscan_die
|
||||
"you must have the liblwp-protocol-https-perl package installed\nto use https URLs";
|
||||
}
|
||||
uscan_verbose "Requesting URL:\n $self->{parse_result}->{base}";
|
||||
my $request = HTTP::Request->new('GET', $self->parse_result->{base});
|
||||
foreach my $k (keys %{ $self->downloader->headers }) {
|
||||
if ($k =~ /^(.*?)@(.*)$/) {
|
||||
my $baseUrl = $1;
|
||||
my $hdr = $2;
|
||||
if ($self->parse_result->{base} =~ m#^\Q$baseUrl\E(?:/.*)?$#) {
|
||||
$request->header($hdr => $self->headers->{$k});
|
||||
uscan_verbose "Set per-host custom header $hdr for "
|
||||
. $self->parse_result->{base};
|
||||
} else {
|
||||
uscan_debug
|
||||
"$self->parse_result->{base} does not start with $1";
|
||||
}
|
||||
} else {
|
||||
uscan_warn "Malformed http-header: $k";
|
||||
}
|
||||
}
|
||||
$request->header('Accept-Encoding' => 'gzip');
|
||||
$request->header('Accept' => '*/*');
|
||||
my $response = $self->downloader->user_agent->request($request);
|
||||
if (!$response->is_success) {
|
||||
uscan_warn
|
||||
"In watchfile $self->{watchfile}, reading webpage\n $self->{parse_result}->{base} failed: "
|
||||
. $response->status_line;
|
||||
return undef;
|
||||
}
|
||||
|
||||
my ($patterns, $base_sites, $base_dirs)
|
||||
= handle_redirection($self, $self->{parse_result}->{filepattern});
|
||||
push @{ $self->patterns }, @$patterns;
|
||||
push @{ $self->sites }, @$base_sites;
|
||||
push @{ $self->basedirs }, @$base_dirs;
|
||||
|
||||
my $content = $response->decoded_content;
|
||||
uscan_extra_debug
|
||||
"received content:\n$content\n[End of received content] by HTTP";
|
||||
|
||||
my @hrefs;
|
||||
if (!$self->searchmode or $self->searchmode eq 'html') {
|
||||
@hrefs = $self->html_search($content, $self->patterns);
|
||||
} elsif ($self->searchmode eq 'plain') {
|
||||
@hrefs = $self->plain_search($content);
|
||||
} else {
|
||||
uscan_warn 'Unknown searchmode "' . $self->searchmode . '", skipping';
|
||||
return undef;
|
||||
}
|
||||
|
||||
if (@hrefs) {
|
||||
@hrefs = Devscripts::Versort::versort(@hrefs);
|
||||
my $msg
|
||||
= "Found the following matching hrefs on the web page (newest first):\n";
|
||||
foreach my $href (@hrefs) {
|
||||
$msg .= " $$href[2] ($$href[1]) index=$$href[0] $$href[3]\n";
|
||||
}
|
||||
uscan_verbose $msg;
|
||||
}
|
||||
my ($newversion, $newfile);
|
||||
if (defined $self->shared->{download_version}
|
||||
and not $self->versionmode eq 'ignore') {
|
||||
|
||||
        # extract the ones for which $match was set in the loop above
|
||||
my @vhrefs = grep { $$_[3] } @hrefs;
|
||||
if (@vhrefs) {
|
||||
(undef, $newversion, $newfile, undef) = @{ $vhrefs[0] };
|
||||
} else {
|
||||
uscan_warn
|
||||
"In $self->{watchfile} no matching hrefs for version $self->{shared}->{download_version}"
|
||||
. " in watch line\n $self->{line}";
|
||||
return undef;
|
||||
}
|
||||
} else {
|
||||
if (@hrefs) {
|
||||
(undef, $newversion, $newfile, undef) = @{ $hrefs[0] };
|
||||
} else {
|
||||
uscan_warn
|
||||
"In $self->{watchfile} no matching files for watch line\n $self->{line}";
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
return ($newversion, $newfile);
|
||||
}
|
||||
|
||||
#######################################################################
|
||||
# determine $upstream_url (http mode)
|
||||
#######################################################################
|
||||
# http is complicated due to absolute/relative URL issue
|
||||
sub http_upstream_url {
|
||||
my ($self) = @_;
|
||||
my $upstream_url;
|
||||
my $newfile = $self->search_result->{newfile};
|
||||
if ($newfile =~ m%^\w+://%) {
|
||||
$upstream_url = $newfile;
|
||||
} elsif ($newfile =~ m%^//%) {
|
||||
$upstream_url = $self->parse_result->{site};
|
||||
$upstream_url =~ s/^(https?:).*/$1/;
|
||||
$upstream_url .= $newfile;
|
||||
} elsif ($newfile =~ m%^/%) {
|
||||
|
||||
# absolute filename
|
||||
# Were there any redirections? If so try using those first
|
||||
if ($#{ $self->patterns } > 0) {
|
||||
|
||||
# replace $site here with the one we were redirected to
|
||||
foreach my $index (0 .. $#{ $self->patterns }) {
|
||||
if ("$self->{sites}->[$index]$newfile"
|
||||
=~ m&^$self->{patterns}->[$index]$&) {
|
||||
$upstream_url = "$self->{sites}->[$index]$newfile";
|
||||
last;
|
||||
}
|
||||
}
|
||||
if (!defined($upstream_url)) {
|
||||
uscan_verbose
|
||||
"Unable to determine upstream url from redirections,\n"
|
||||
. "defaulting to using site specified in watch file";
|
||||
$upstream_url = "$self->{sites}->[0]$newfile";
|
||||
}
|
||||
} else {
|
||||
$upstream_url = "$self->{sites}->[0]$newfile";
|
||||
}
|
||||
} else {
|
||||
# relative filename, we hope
|
||||
# Were there any redirections? If so try using those first
|
||||
if ($#{ $self->patterns } > 0) {
|
||||
|
||||
# replace $site here with the one we were redirected to
|
||||
foreach my $index (0 .. $#{ $self->patterns }) {
|
||||
|
||||
# skip unless the basedir looks like a directory
|
||||
next unless $self->{basedirs}->[$index] =~ m%/$%;
|
||||
my $nf = "$self->{basedirs}->[$index]$newfile";
|
||||
if ("$self->{sites}->[$index]$nf"
|
||||
=~ m&^$self->{patterns}->[$index]$&) {
|
||||
$upstream_url = "$self->{sites}->[$index]$nf";
|
||||
last;
|
||||
}
|
||||
}
|
||||
if (!defined($upstream_url)) {
|
||||
uscan_verbose
|
||||
"Unable to determine upstream url from redirections,\n"
|
||||
. "defaulting to using site specified in watch file";
|
||||
$upstream_url = "$self->{parse_result}->{urlbase}$newfile";
|
||||
}
|
||||
} else {
|
||||
$upstream_url = "$self->{parse_result}->{urlbase}$newfile";
|
||||
}
|
||||
}
|
||||
|
||||
# mangle if necessary
|
||||
$upstream_url =~ s/&/&/g;
|
||||
uscan_verbose "Matching target for downloadurlmangle: $upstream_url";
|
||||
if (@{ $self->downloadurlmangle }) {
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'downloadurlmangle:', \@{ $self->downloadurlmangle },
|
||||
\$upstream_url
|
||||
)
|
||||
) {
|
||||
$self->status(1);
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
return $upstream_url;
|
||||
}
|
||||
|
||||
sub http_newdir {
|
||||
my ($https, $line, $site, $dir, $pattern, $dirversionmangle,
|
||||
$watchfile, $lineptr, $download_version)
|
||||
= @_;
|
||||
|
||||
my $downloader = $line->downloader;
|
||||
my ($request, $response, $newdir);
|
||||
my ($download_version_short1, $download_version_short2,
|
||||
$download_version_short3)
|
||||
= partial_version($download_version);
|
||||
my $base = $site . $dir;
|
||||
|
||||
$pattern .= "/?";
|
||||
|
||||
if (defined($https) and !$downloader->ssl) {
|
||||
uscan_die
|
||||
"$progname: you must have the liblwp-protocol-https-perl package installed\n"
|
||||
. "to use https URLs";
|
||||
}
|
||||
# At least for now, set base in the line object - other methods need it
|
||||
local $line->parse_result->{base} = $base;
|
||||
$request = HTTP::Request->new('GET', $base);
|
||||
$response = $downloader->user_agent->request($request);
|
||||
if (!$response->is_success) {
|
||||
uscan_warn
|
||||
"In watch file $watchfile, reading webpage\n $base failed: "
|
||||
. $response->status_line;
|
||||
return '';
|
||||
}
|
||||
|
||||
my $content = $response->content;
|
||||
if ( $response->header('Content-Encoding')
|
||||
and $response->header('Content-Encoding') =~ /^gzip$/i) {
|
||||
require IO::Uncompress::Gunzip;
|
||||
require IO::String;
|
||||
uscan_debug "content seems gzip encoded, let's decode it";
|
||||
my $out;
|
||||
if (IO::Uncompress::Gunzip::gunzip(IO::String->new($content), \$out)) {
|
||||
$content = $out;
|
||||
} else {
|
||||
uscan_warn 'Unable to decode remote content: '
|
||||
. $IO::Uncompress::GunzipError;
|
||||
return '';
|
||||
}
|
||||
}
|
||||
uscan_extra_debug
|
||||
"received content:\n$content\n[End of received content] by HTTP";
|
||||
|
||||
clean_content(\$content);
|
||||
|
||||
my ($dirpatterns, $base_sites, $base_dirs)
|
||||
= handle_redirection($line, $pattern, $base);
|
||||
$downloader->user_agent->clear_redirections; # we won't be needing that
|
||||
|
||||
my @hrefs;
|
||||
for my $parsed (
|
||||
html_search($line, $content, $dirpatterns, 'dirversionmangle')) {
|
||||
my ($priority, $mangled_version, $href, $match) = @$parsed;
|
||||
$match = '';
|
||||
if (defined $download_version
|
||||
and $mangled_version eq $download_version) {
|
||||
$match = "matched with the download version";
|
||||
}
|
||||
if (defined $download_version_short3
|
||||
and $mangled_version eq $download_version_short3) {
|
||||
$match = "matched with the download version (partial 3)";
|
||||
}
|
||||
if (defined $download_version_short2
|
||||
and $mangled_version eq $download_version_short2) {
|
||||
$match = "matched with the download version (partial 2)";
|
||||
}
|
||||
if (defined $download_version_short1
|
||||
and $mangled_version eq $download_version_short1) {
|
||||
$match = "matched with the download version (partial 1)";
|
||||
}
|
||||
push @hrefs, [$mangled_version, $href, $match];
|
||||
}
|
||||
|
||||
# extract the entries for which $match was set in the loop above
|
||||
my @vhrefs = grep { $$_[2] } @hrefs;
|
||||
if (@vhrefs) {
|
||||
@vhrefs = Devscripts::Versort::upstream_versort(@vhrefs);
|
||||
$newdir = $vhrefs[0][1];
|
||||
}
|
||||
if (@hrefs) {
|
||||
@hrefs = Devscripts::Versort::upstream_versort(@hrefs);
|
||||
my $msg = "Found the following matching directories (newest first):\n";
|
||||
foreach my $href (@hrefs) {
|
||||
$msg .= " $$href[1] ($$href[0]) $$href[2]\n";
|
||||
}
|
||||
uscan_verbose $msg;
|
||||
$newdir //= $hrefs[0][1];
|
||||
} else {
|
||||
uscan_warn
|
||||
"In $watchfile,\n no matching hrefs for pattern\n $site$dir$pattern";
|
||||
return '';
|
||||
}
|
||||
|
||||
# return just the final directory component
|
||||
$newdir =~ s%/$%%;
|
||||
$newdir =~ s%^.*/%%;
|
||||
return ($newdir);
|
||||
}
|
||||
|
||||
# Nothing to clean here
|
||||
sub http_clean { 0 }
|
||||
|
||||
sub clean_content {
|
||||
my ($content) = @_;
|
||||
|
||||
# We need this horrid stuff to handle href=foo type
|
||||
# links. OK, bad HTML, but we have to handle it nonetheless.
|
||||
# It's bug #89749.
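# A minimal illustration (hypothetical input): an unquoted attribute such as
#   <a href=foo-1.0.tar.gz>
# is rewritten by the substitution below to
#   <a href="foo-1.0.tar.gz">
# so that the quoted-href matching elsewhere in this module still works.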
|
||||
$$content =~ s/href\s*=\s*(?=[^\"\'])([^\s>]+)/href="$1"/ig;
|
||||
|
||||
# Strip comments
|
||||
$$content =~ s/<!-- .*?-->//sg;
|
||||
return $content;
|
||||
}
|
||||
|
||||
sub url_canonicalize_dots {
|
||||
my ($base, $url) = @_;
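# A rough sketch of the intended behaviour (hypothetical values):
#   url_canonicalize_dots('http://example.org/dir/index.html',
#       '../pkg/./foo-1.0.tar.gz')
# resolves the relative reference and removes the dot segments, giving
# 'http://example.org/pkg/foo-1.0.tar.gz'.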
|
||||
|
||||
if ($url !~ m{^[^:#?/]+://}) {
|
||||
if ($url =~ m{^//}) {
|
||||
$base =~ m{^[^:#?/]+:}
|
||||
and $url = $& . $url;
|
||||
} elsif ($url =~ m{^/}) {
|
||||
$base =~ m{^[^:#?/]+://[^/#?]*}
|
||||
and $url = $& . $url;
|
||||
} else {
|
||||
uscan_debug "Resolving urls with query part unimplemented"
|
||||
if ($url =~ m/^[#?]/);
|
||||
$base =~ m{^[^:#?/]+://[^/#?]*(?:/(?:[^#?/]*/)*)?} and do {
|
||||
my $base_to_path = $&;
|
||||
$base_to_path .= '/' unless $base_to_path =~ m|/$|;
|
||||
$url = $base_to_path . $url;
|
||||
};
|
||||
}
|
||||
}
|
||||
$url =~ s{^([^:#?/]+://[^/#?]*)(/[^#?]*)}{
|
||||
my ($h, $p) = ($1, $2);
|
||||
$p =~ s{/\.(?:/|$|(?=[#?]))}{/}g;
|
||||
1 while $p =~ s{/(?!\.\./)[^/]*/\.\.(?:/|(?=[#?])|$)}{/}g;
|
||||
$h.$p;}e;
|
||||
$url;
|
||||
}
|
||||
|
||||
sub html_search {
|
||||
my ($self, $content, $patterns, $mangle) = @_;
|
||||
|
||||
# pagemangle: a slow operation, so it should not be abused
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'pagemangle:\n', [@{ $self->pagemangle }],
|
||||
\$content
|
||||
)
|
||||
) {
|
||||
return undef;
|
||||
}
|
||||
if ( !$self->shared->{bare}
|
||||
and $content =~ m%^<[?]xml%i
|
||||
and $content =~ m%xmlns="http://s3.amazonaws.com/doc/2006-03-01/"%
|
||||
and $content !~ m%<Key><a\s+href%) {
|
||||
# this is an S3 bucket listing. Insert an 'a href' tag
|
||||
# into the content for each 'Key', so that it looks like html (LP: #798293)
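# For example (hypothetical listing entry), <Key>foo-1.0.tar.gz</Key> becomes
# <Key><a href="foo-1.0.tar.gz">foo-1.0.tar.gz</a></Key>, which the regular
# href scanning below can then pick up.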
|
||||
uscan_warn
|
||||
"*** Amazon AWS special case code is deprecated***\nUse opts=pagemangle rule, instead";
|
||||
$content =~ s%<Key>([^<]*)</Key>%<Key><a href="$1">$1</a></Key>%g;
|
||||
uscan_extra_debug
|
||||
"processed content:\n$content\n[End of processed content] by Amazon AWS special case code";
|
||||
}
|
||||
clean_content(\$content);
|
||||
|
||||
# Is there a base URL given?
|
||||
if ($content =~ /<\s*base\s+[^>]*href\s*=\s*([\"\'])(.*?)\1/i) {
|
||||
$self->parse_result->{urlbase}
|
||||
= url_canonicalize_dots($self->parse_result->{base}, $2);
|
||||
} else {
|
||||
$self->parse_result->{urlbase} = $self->parse_result->{base};
|
||||
}
|
||||
uscan_extra_debug
|
||||
"processed content:\n$content\n[End of processed content] by fix bad HTML code";
|
||||
|
||||
# search hrefs in the web page to obtain a list of uversionmangled versions and matching download URLs
|
||||
{
|
||||
local $, = ',';
|
||||
uscan_verbose "Matching pattern:\n @{$self->{patterns}}";
|
||||
}
|
||||
my @hrefs;
|
||||
while ($content =~ m/<\s*a\s+[^>]*(?<=\s)href\s*=\s*([\"\'])(.*?)\1/sgi) {
|
||||
my $href = $2;
|
||||
$href = fix_href($href);
|
||||
my $href_canonical
|
||||
= url_canonicalize_dots($self->parse_result->{urlbase}, $href);
|
||||
if (defined $self->hrefdecode) {
|
||||
if ($self->hrefdecode eq 'percent-encoding') {
|
||||
uscan_debug "... Decoding from href: $href";
|
||||
$href =~ s/%([A-Fa-f\d]{2})/chr hex $1/eg;
|
||||
$href_canonical =~ s/%([A-Fa-f\d]{2})/chr hex $1/eg;
|
||||
} else {
|
||||
uscan_warn "Illegal value for hrefdecode: "
|
||||
. "$self->{hrefdecode}";
|
||||
return undef;
|
||||
}
|
||||
}
|
||||
uscan_extra_debug "Checking href $href";
|
||||
foreach my $_pattern (@$patterns) {
|
||||
if (my @match = $href =~ /^$_pattern$/) {
|
||||
push @hrefs,
|
||||
parse_href($self, $href_canonical, $_pattern, \@match,
|
||||
$mangle);
|
||||
}
|
||||
uscan_extra_debug "Checking href $href_canonical";
|
||||
if (my @match = $href_canonical =~ /^$_pattern$/) {
|
||||
push @hrefs,
|
||||
parse_href($self, $href_canonical, $_pattern, \@match,
|
||||
$mangle);
|
||||
}
|
||||
}
|
||||
}
|
||||
return @hrefs;
|
||||
}
|
||||
|
||||
sub plain_search {
|
||||
my ($self, $content) = @_;
|
||||
my @hrefs;
|
||||
foreach my $_pattern (@{ $self->patterns }) {
|
||||
while ($content =~ s/.*?($_pattern)//) {
|
||||
push @hrefs, $self->parse_href($1, $_pattern, $2);
|
||||
}
|
||||
}
|
||||
$self->parse_result->{urlbase} = $self->parse_result->{base};
|
||||
return @hrefs;
|
||||
}
|
||||
|
||||
sub parse_href {
|
||||
my ($self, $href, $_pattern, $match, $mangle) = @_;
|
||||
$mangle //= 'uversionmangle';
|
||||
|
||||
my $mangled_version;
|
||||
if ($self->watch_version == 2) {
|
||||
|
||||
# watch_version 2 only recognised one group; the code
|
||||
# below will break version 2 watch files with a construction
|
||||
# such as file-([\d\.]+(-\d+)?) (bug #327258)
|
||||
$mangled_version
|
||||
= ref $match eq 'ARRAY'
|
||||
? $match->[0]
|
||||
: $match;
|
||||
} else {
|
||||
# need the map { ... } here to handle cases of (...)?
|
||||
# which may match but then return undef values
|
||||
if ($self->versionless) {
|
||||
|
||||
# exception, otherwise $mangled_version = 1
|
||||
$mangled_version = '';
|
||||
} else {
|
||||
$mangled_version = join(".",
|
||||
map { $_ if defined($_) }
|
||||
ref $match eq 'ARRAY' ? @$match : $href =~ m&^$_pattern$&);
|
||||
}
|
||||
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
"$mangle:", \@{ $self->$mangle },
|
||||
\$mangled_version
|
||||
)
|
||||
) {
|
||||
return ();
|
||||
}
|
||||
}
|
||||
$match = '';
|
||||
if (defined $self->shared->{download_version}) {
|
||||
if ($mangled_version eq $self->shared->{download_version}) {
|
||||
$match = "matched with the download version";
|
||||
}
|
||||
}
|
||||
my $priority = $mangled_version . '-' . get_priority($href);
|
||||
return [$priority, $mangled_version, $href, $match];
|
||||
}
|
||||
|
||||
1;
|
67
lib/Devscripts/Uscan/svn.pm
Normal file
67
lib/Devscripts/Uscan/svn.pm
Normal file
|
@@ -0,0 +1,67 @@
|
|||
package Devscripts::Uscan::svn;
|
||||
|
||||
use strict;
|
||||
use Cwd qw/abs_path/;
|
||||
use Devscripts::Uscan::Output;
|
||||
use Devscripts::Uscan::Utils;
|
||||
use Devscripts::Uscan::_vcs;
|
||||
use Dpkg::IPC;
|
||||
use File::Path 'remove_tree';
|
||||
use Moo::Role;
|
||||
|
||||
######################################################
|
||||
# search $newfile $newversion (svn mode/versionless)
|
||||
######################################################
|
||||
sub svn_search {
|
||||
my ($self) = @_;
|
||||
my ($newfile, $newversion);
|
||||
if ($self->versionless) {
|
||||
$newfile = $self->parse_result->{base};
|
||||
spawn(
|
||||
exec => [
|
||||
'svn', 'info',
|
||||
'--show-item', 'last-changed-revision',
|
||||
'--no-newline', $self->parse_result->{base}
|
||||
],
|
||||
wait_child => 1,
|
||||
to_string => \$newversion
|
||||
);
|
||||
chomp($newversion);
|
||||
$newversion = sprintf '0.0~svn%d', $newversion;
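# e.g. (hypothetical) a last-changed revision of 1234 becomes upstream
# version "0.0~svn1234" before uversionmangle is applied below.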
|
||||
if (
|
||||
mangle(
|
||||
$self->watchfile, \$self->line,
|
||||
'uversionmangle:', \@{ $self->uversionmangle },
|
||||
\$newversion
|
||||
)
|
||||
) {
|
||||
return undef;
|
||||
}
|
||||
|
||||
}
|
||||
################################################
|
||||
# search $newfile $newversion (svn mode w/tag)
|
||||
################################################
|
||||
elsif ($self->mode eq 'svn') {
|
||||
my @args = ('list', $self->parse_result->{base});
|
||||
($newversion, $newfile)
|
||||
= get_refs($self, ['svn', @args], qr/(.+)/, 'subversion');
|
||||
return undef if !defined $newversion;
|
||||
}
|
||||
return ($newversion, $newfile);
|
||||
}
|
||||
|
||||
sub svn_upstream_url {
|
||||
my ($self) = @_;
|
||||
my $upstream_url = $self->parse_result->{base};
|
||||
if (!$self->versionless) {
|
||||
$upstream_url .= '/' . $self->search_result->{newfile};
|
||||
}
|
||||
return $upstream_url;
|
||||
}
|
||||
|
||||
*svn_newfile_base = \&Devscripts::Uscan::_vcs::_vcs_newfile_base;
|
||||
|
||||
sub svn_clean { }
|
||||
|
||||
1;
|
40
lib/Devscripts/Utils.pm
Normal file
40
lib/Devscripts/Utils.pm
Normal file
|
@@ -0,0 +1,40 @@
|
|||
package Devscripts::Utils;
|
||||
|
||||
use strict;
|
||||
use Devscripts::Output;
|
||||
use Dpkg::IPC;
|
||||
use Exporter 'import';
|
||||
|
||||
our @EXPORT = qw(ds_exec ds_exec_no_fail);
|
||||
|
||||
sub ds_exec_no_fail {
|
||||
{
|
||||
local $, = ' ';
|
||||
ds_debug "Execute: @_...";
|
||||
}
|
||||
spawn(
|
||||
exec => [@_],
|
||||
to_file => '/dev/null',
|
||||
wait_child => 1,
|
||||
nocheck => 1,
|
||||
);
|
||||
return $?;
|
||||
}
|
||||
|
||||
sub ds_exec {
|
||||
{
|
||||
local $, = ' ';
|
||||
ds_debug "Execute: @_...";
|
||||
}
|
||||
spawn(
|
||||
exec => [@_],
|
||||
wait_child => 1,
|
||||
nocheck => 1,
|
||||
);
|
||||
if ($?) {
|
||||
local $, = ' ';
|
||||
ds_die "Command failed (@_)";
|
||||
}
|
||||
}
|
||||
|
||||
1;
|
60
lib/Devscripts/Versort.pm
Normal file
60
lib/Devscripts/Versort.pm
Normal file
|
@@ -0,0 +1,60 @@
|
|||
# Copyright (C) 1998,2002 Julian Gilbey <jdg@debian.org>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2 of the License, or (at
|
||||
# your option) any later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
|
||||
# General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
# The functions in this Perl module are versort, upstream_versort and
|
||||
# deb_versort. They each take as input an array of elements of the form
|
||||
# [version, data, ...] and sort them into decreasing order according to dpkg's
|
||||
# understanding of version sorting. The output is a sorted array. In
|
||||
# upstream_versort, "version" is assumed to be an upstream version number only,
|
||||
# whereas in deb_versort, "version" is assumed to be a Debian version number,
|
||||
# possibly including an epoch and/or a Debian revision. versort is available
|
||||
# for compatibility reasons. It compares versions as Debian versions
|
||||
# (i.e. 1-2-4 < 1-3) but disables checks for well-formed versions.
|
||||
#
|
||||
# The returned array has the greatest version as the 0th array element.
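# A minimal usage sketch (hypothetical versions):
#   my @sorted = Devscripts::Versort::upstream_versort(
#       ["1.2",  "foo-1.2.tar.gz"],
#       ["1.10", "foo-1.10.tar.gz"]);
#   # $sorted[0][0] is "1.10", since dpkg sorts 1.10 above 1.2.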
|
||||
|
||||
package Devscripts::Versort;
|
||||
use Dpkg::Version;
|
||||
|
||||
sub versort (@) {
|
||||
return _versort(0, sub { return shift->[0] }, @_);
|
||||
}
|
||||
|
||||
sub deb_versort (@) {
|
||||
return _versort(1, sub { return shift->[0] }, @_);
|
||||
}
|
||||
|
||||
sub upstream_versort (@) {
|
||||
return _versort(0, sub { return "1:" . shift->[0] . "-0" }, @_);
|
||||
}
|
||||
|
||||
sub _versort ($@) {
|
||||
my ($check, $getversion, @namever_pairs) = @_;
|
||||
|
||||
foreach my $pair (@namever_pairs) {
|
||||
unshift(@$pair,
|
||||
Dpkg::Version->new(&$getversion($pair), check => $check));
|
||||
}
|
||||
|
||||
my @sorted = sort { $b->[0] <=> $a->[0] } @namever_pairs;
|
||||
|
||||
foreach my $pair (@sorted) {
|
||||
shift @$pair;
|
||||
}
|
||||
|
||||
return @sorted;
|
||||
}
|
||||
|
||||
1;
|
38
po4a/Makefile
Normal file
38
po4a/Makefile
Normal file
|
@@ -0,0 +1,38 @@
|
|||
include ../Makefile.common
|
||||
|
||||
LANGS = de fr pt
|
||||
|
||||
DESC_de/ = Debian-Hilfswerkzeuge
|
||||
DESC_fr/ = Utilitaires Debian
|
||||
DESC_pt/ = Utilitários Debian
|
||||
|
||||
GEN_TR_MAN1S := $(sort $(foreach lang,$(LANGS),$(patsubst %.1,$(lang)/%.$(lang).1,$(GEN_MAN1S))))
|
||||
|
||||
all: translate
|
||||
# GEN_TR_MAN1S needs translate finished, serialize the calls
|
||||
@$(MAKE) transform
|
||||
transform: $(GEN_TR_MAN1S)
|
||||
|
||||
translate: ../doc/devscripts.7
|
||||
po4a --previous --no-backups --keep=0 devscripts-po4a.conf
|
||||
touch $@
|
||||
|
||||
clean: ../doc/devscripts.7
|
||||
po4a --previous --rm-translations --no-backups devscripts-po4a.conf
|
||||
rm -f $(GEN_TR_MAN1S) translate
|
||||
rm -rf $(LANGS)
|
||||
|
||||
../doc/devscripts.7:
|
||||
# po4a translate and clean need ../doc/devscripts.7, rebuild it
|
||||
$(MAKE) -C ../doc devscripts.7
|
||||
|
||||
%.1:: %.pl translate
|
||||
-podchecker $<
|
||||
pod2man --utf8 --center=" " --release="$(DESC_$(dir $@))" $< > $@
|
||||
%.1:: %.pod translate
|
||||
-podchecker $<
|
||||
pod2man --utf8 --center=" " --release="$(DESC_$(dir $@))" $< > $@
|
||||
%.1:: %.dbk translate
|
||||
mkdir -p $(LANGS)
|
||||
xsltproc --nonet -o $@ \
|
||||
/usr/share/sgml/docbook/stylesheet/xsl/nwalsh/manpages/docbook.xsl $<
|
16
po4a/add_de/translator_man.add
Normal file
16
po4a/add_de/translator_man.add
Normal file
|
@@ -0,0 +1,16 @@
|
|||
PO4A-HEADER:mode=after;position=^\.SH BESCHREIBUNG;beginboundary=FakePo4aBoundary
|
||||
.SH ÜBERSETZUNG
|
||||
Diese Übersetzung wurde mit dem Werkzeug
|
||||
.B po4a
|
||||
<URL:https://po4a.org/>
|
||||
durch Chris Leick
|
||||
.I c.leick@vollbio.de
|
||||
im Juli 2012 erstellt und vom deutschen Debian-Übersetzer-Team korrekturgelesen.
|
||||
Bitte melden Sie alle Fehler in der Übersetzung an
|
||||
.I debian-l10n-german@lists.debian.org
|
||||
oder als Fehlerbericht an das Paket
|
||||
.IR devscripts .
|
||||
Sie können mit dem folgenden Befehl das englische
|
||||
Original anzeigen
|
||||
.RB "»" "man -L C"
|
||||
.IR "Abschnitt deutsche_Handbuchseite" "«."
|
28
po4a/add_fr/translator_dbk.add
Normal file
28
po4a/add_fr/translator_dbk.add
Normal file
|
@@ -0,0 +1,28 @@
|
|||
PO4A-HEADER:mode=after;position=AUTEUR;endboundary=</refsect1>
|
||||
|
||||
<refsect1>
|
||||
<title>TRADUCTION</title>
|
||||
|
||||
<para>
|
||||
Cette page de manuel a été traduite et revue par un ou plusieurs
|
||||
traducteurs dont Cyril Brulebois, Thomas Huriaux, David Prévot et
|
||||
Xavier Guimard.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
L'équipe de traduction a fait le maximum pour réaliser une adaptation
|
||||
française de qualité.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
La version anglaise la plus à jour de ce document est toujours
|
||||
consultable en ajoutant l'option « -L C » à la
|
||||
commande <command>man</command>.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
Nʼhésitez pas à signaler à lʼauteur ou à la liste de traduction
|
||||
<email>debian-l10n-french@lists.debian.org</email>
|
||||
selon le cas, toute erreur dans cette page de manuel.
|
||||
</para>
|
||||
</refsect1>
|
20
po4a/add_fr/translator_man.add
Normal file
20
po4a/add_fr/translator_man.add
Normal file
|
@@ -0,0 +1,20 @@
|
|||
PO4A-HEADER:mode=after;position=^\.SH NOM;beginboundary=FakePo4aBoundary
|
||||
.SH TRADUCTION
|
||||
Ce document est une traduction, maintenue à lʼaide de lʼoutil
|
||||
po4a <\fIhttps://po4a.org/\fR> par lʼéquipe de
|
||||
traduction francophone du projet Debian.
|
||||
|
||||
Plusieurs traducteurs dont Nicolas François, Guillaume Delacour, Cyril
|
||||
Brulebois, Thomas Huriaux, David Prévot et Xavier Guimard ont contribué aux
|
||||
traductions.
|
||||
|
||||
Lʼéquipe de traduction a fait le maximum pour réaliser une adaptation
|
||||
française de qualité.
|
||||
Veuillez signaler toute erreur de traduction en écrivant à
|
||||
.nh
|
||||
<\fIdebian\-l10n\-french@lists.debian.org\fR>
|
||||
.hy
|
||||
ou par un rapport de bogue sur le paquet devscripts.
|
||||
|
||||
La version anglaise la plus à jour de ce document est toujours consultable
|
||||
en ajoutant lʼoption «\ \fB\-L\ C\fR\ » à la commande \fBman\fR.
|
10
po4a/add_fr/translator_pod.add
Normal file
10
po4a/add_fr/translator_pod.add
Normal file
|
@@ -0,0 +1,10 @@
|
|||
PO4A-HEADER:mode=after;position=^=head1 NOM;beginboundary=FakePo4aBoundary
|
||||
=head1 TRADUCTION
|
||||
|
||||
Cyril Brulebois <I<cyril.brulebois@enst-bretagne.fr>>, 2006
|
||||
|
||||
Thomas Huriaux <I<thomas.huriaux@gmail.com>>, 2006
|
||||
|
||||
David Prévot <I<david@tilapin.org>>, 2010-2013
|
||||
|
||||
Xavier Guimard <I<yadd@debian.org>>, 2018-2024
|
6
po4a/add_pt/translator_man.add
Normal file
6
po4a/add_pt/translator_man.add
Normal file
|
@@ -0,0 +1,6 @@
|
|||
PO4A-HEADER:mode=after;position=^\.SH NOM;beginboundary=FakePo4aBoundary
|
||||
.SH TRADUÇÃO
|
||||
Tradução para Português por Américo Monteiro
|
||||
|
||||
Se encontrar algum defeito nesta tradução
|
||||
comunique-o para a_monteiro@gmx.com
|
146
po4a/devscripts-po4a.conf
Normal file
146
po4a/devscripts-po4a.conf
Normal file
|
@@ -0,0 +1,146 @@
|
|||
[po_directory] po/
|
||||
|
||||
# List the documents to translate, their format, their translations
|
||||
# (as well as the addendums to apply to the translations)
|
||||
[type:man] ../scripts/annotate-output.1 \
|
||||
$lang:$lang/annotate-output.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/archpath.1 \
|
||||
$lang:$lang/archpath.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/bts.pl \
|
||||
$lang:$lang/bts.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/build-rdeps.pl \
|
||||
$lang:$lang/build-rdeps.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/chdist.pl \
|
||||
$lang:$lang/chdist.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/checkbashisms.1 \
|
||||
$lang:$lang/checkbashisms.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/cowpoke.1 \
|
||||
$lang:$lang/cowpoke.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dcmd.1 \
|
||||
$lang:$lang/dcmd.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dd-list.1 \
|
||||
$lang:$lang/dd-list.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/deb2apptainer.1 \
|
||||
$lang:$lang/deb2apptainer.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/deb2docker.1 \
|
||||
$lang:$lang/deb2docker.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debc.1 \
|
||||
$lang:$lang/debc.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debchange.1 \
|
||||
$lang:$lang/debchange.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/debcheckout.pl \
|
||||
$lang:$lang/debcheckout.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/debclean.1 \
|
||||
$lang:$lang/debclean.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/debcommit.pl \
|
||||
$lang:$lang/debcommit.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/debdiff.1 \
|
||||
$lang:$lang/debdiff.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debdiff-apply.1 \
|
||||
$lang:$lang/debdiff-apply.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debi.1 \
|
||||
$lang:$lang/debi.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/debrepro.pod \
|
||||
$lang:$lang/debrepro.$lang.pod add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/debrelease.1 \
|
||||
$lang:$lang/debrelease.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:docbook] ../scripts/deb-reversion.dbk \
|
||||
$lang:$lang/deb-reversion.$lang.dbk add_$lang:?add_$lang/translator_dbk.add
|
||||
[type:pod] ../scripts/deb-why-removed.pl \
|
||||
$lang:$lang/deb-why-removed.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/debrsign.1 \
|
||||
$lang:$lang/debrsign.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debsign.1 \
|
||||
$lang:$lang/debsign.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debsnap.1 \
|
||||
$lang:$lang/debsnap.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/debuild.1 \
|
||||
$lang:$lang/debuild.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dep3changelog.1 \
|
||||
$lang:$lang/dep3changelog.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dep-14-convert-git-branch-names.1 \
|
||||
$lang:$lang/dep-14-convert-git-branch-names.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../doc/devscripts.7 \
|
||||
$lang:$lang/devscripts.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/dget.pl \
|
||||
$lang:$lang/dget.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/diff2patches.1 \
|
||||
$lang:$lang/diff2patches.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dpkg-depcheck.1 \
|
||||
$lang:$lang/dpkg-depcheck.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dpkg-genbuilddeps.1 \
|
||||
$lang:$lang/dpkg-genbuilddeps.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../doc/edit-patch.1 \
|
||||
$lang:$lang/edit-patch.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dscextract.1 \
|
||||
$lang:$lang/dscextract.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/dscverify.1 \
|
||||
$lang:$lang/dscverify.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/getbuildlog.1 \
|
||||
$lang:$lang/getbuildlog.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/git-deborig.pl \
|
||||
$lang:$lang/git-deborig.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/grep-excuses.1 \
|
||||
$lang:$lang/grep-excuses.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/hardening-check.pl \
|
||||
$lang:$lang/hardening-check.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/list-unreleased.1 \
|
||||
$lang:$lang/list-unreleased.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/ltnu.pod \
|
||||
$lang:$lang/ltnu.$lang.pod add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/manpage-alert.1 \
|
||||
$lang:$lang/manpage-alert.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/mass-bug.pl \
|
||||
$lang:$lang/mass-bug.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/mergechanges.1 \
|
||||
$lang:$lang/mergechanges.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/mk-build-deps.pl \
|
||||
$lang:$lang/mk-build-deps.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/mk-origtargz.pl \
|
||||
$lang:$lang/mk-origtargz.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/namecheck.pl \
|
||||
$lang:$lang/namecheck.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/nmudiff.1 \
|
||||
$lang:$lang/nmudiff.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/origtargz.pl \
|
||||
$lang:$lang/origtargz.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/plotchangelog.1 \
|
||||
$lang:$lang/plotchangelog.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/pts-subscribe.1 \
|
||||
$lang:$lang/pts-subscribe.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/rc-alert.1 \
|
||||
$lang:$lang/rc-alert.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/rmadison.pl \
|
||||
$lang:$lang/rmadison.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/sadt.pod \
|
||||
$lang:$lang/sadt.$lang.pod add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/salsa.pl \
|
||||
$lang:$lang/salsa.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../doc/suspicious-source.1 \
|
||||
$lang:$lang/suspicious-source.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/svnpath.pl \
|
||||
$lang:$lang/svnpath.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/tagpending.pl \
|
||||
$lang:$lang/tagpending.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/transition-check.pl \
|
||||
$lang:$lang/transition-check.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:pod] ../scripts/uscan.pl \
|
||||
$lang:$lang/uscan.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/uupdate.1 \
|
||||
$lang:$lang/uupdate.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../doc/what-patch.1 \
|
||||
$lang:$lang/what-patch.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/whodepends.1 \
|
||||
$lang:$lang/whodepends.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/who-uploads.1 \
|
||||
$lang:$lang/who-uploads.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:pod] ../scripts/who-permits-upload.pl \
|
||||
$lang:$lang/who-permits-upload.$lang.pl add_$lang:?add_$lang/translator_pod.add
|
||||
[type:man] ../scripts/wnpp-alert.1 \
|
||||
$lang:$lang/wnpp-alert.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../scripts/wnpp-check.1 \
|
||||
$lang:$lang/wnpp-check.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../doc/wrap-and-sort.1 \
|
||||
$lang:$lang/wrap-and-sort.$lang.1 add_$lang:?add_$lang/translator_man.add
|
||||
[type:man] ../doc/devscripts.conf.5 \
|
||||
$lang:$lang/devscripts.conf.$lang.5 add_$lang:?add_$lang/translator_man.add
|
32367
po4a/po/de.po
Normal file
32367
po4a/po/de.po
Normal file
File diff suppressed because it is too large
25436
po4a/po/devscripts.pot
Normal file
25436
po4a/po/devscripts.pot
Normal file
File diff suppressed because it is too large
25615
po4a/po/fr.po
Normal file
25615
po4a/po/fr.po
Normal file
File diff suppressed because it is too large
31875
po4a/po/pt.po
Normal file
31875
po4a/po/pt.po
Normal file
File diff suppressed because it is too large
184
scripts/Makefile
Normal file
184
scripts/Makefile
Normal file
|
@@ -0,0 +1,184 @@
|
|||
|
||||
include ../Makefile.common
|
||||
DESTDIR =
|
||||
|
||||
define \n
|
||||
|
||||
|
||||
endef
|
||||
|
||||
VERSION_FILE = ../version
|
||||
VERSION = $(shell cat $(VERSION_FILE))
|
||||
|
||||
DEB_VENDOR = $(shell dpkg-vendor --query Vendor)
|
||||
|
||||
PL_FILES := $(wildcard *.pl)
|
||||
SH_FILES = $(wildcard *.sh)
|
||||
SCRIPTS = $(patsubst %.pl,%,$(PL_FILES)) $(patsubst %.sh,%,$(SH_FILES))
|
||||
PL_CHECKS = $(patsubst %.pl,%.pl_check,$(PL_FILES))
|
||||
SH_CHECKS = $(patsubst %.sh,%.sh_check,$(SH_FILES))
|
||||
COMPL_FILES := $(wildcard *.bash_completion)
|
||||
BC_BUILD_DIR:=bash_completion
|
||||
COMPLETION = $(patsubst %.bash_completion,$(BC_BUILD_DIR)/%,$(COMPL_FILES))
|
||||
COMPL_DIR := $(shell pkg-config --variable=completionsdir bash-completion)
|
||||
PKGNAMES := \
|
||||
build-rdeps \
|
||||
dd-list \
|
||||
deb2apptainer \
|
||||
deb2docker \
|
||||
debcheckout \
|
||||
debsnap \
|
||||
dget \
|
||||
getbuildlog \
|
||||
grep-excuses \
|
||||
mass-bug \
|
||||
mk-build-deps \
|
||||
pts-subscribe \
|
||||
pts-unsubscribe \
|
||||
rc-alert \
|
||||
rmadison \
|
||||
transition-check \
|
||||
who-uploads \
|
||||
whodepends \
|
||||
wnpp-alert \
|
||||
wnpp-check \
|
||||
|
||||
GEN_MAN1S += \
|
||||
deb-why-removed.1 \
|
||||
debbisect.1 \
|
||||
debftbfs.1 \
|
||||
debootsnap.1 \
|
||||
debrebuild.1 \
|
||||
debrepro.1 \
|
||||
ltnu.1 \
|
||||
mk-origtargz.1 \
|
||||
salsa.1 \
|
||||
reproducible-check.1 \
|
||||
uscan.1 \
|
||||
|
||||
all: $(SCRIPTS) $(GEN_MAN1S) $(COMPLETION)
|
||||
|
||||
scripts: $(SCRIPTS)
|
||||
|
||||
$(VERSION_FILE):
|
||||
$(MAKE) -C .. version
|
||||
|
||||
%: %.sh
|
||||
|
||||
debchange: debchange.pl $(VERSION_FILE)
|
||||
sed "s/###VERSION###/$(VERSION)/" $< > $@
|
||||
chmod --reference=$< $@
|
||||
ifeq ($(DEB_VENDOR),Ubuntu)
|
||||
# On Ubuntu always default to targeting the release that it's built on,
|
||||
# not the current devel release, since its primary use on stable releases
|
||||
# will be for preparing PPA uploads.
|
||||
sed -i 's/get_ubuntu_devel_distro()/"$(shell lsb_release -cs)"/' $@
|
||||
endif
|
||||
|
||||
%.tmp: %.sh $(VERSION_FILE)
|
||||
sed -e "s/###VERSION###/$(VERSION)/" $< > $@
|
||||
%.tmp: %.pl $(VERSION_FILE)
|
||||
sed -e "s/###VERSION###/$(VERSION)/" $< > $@
|
||||
%: %.tmp
|
||||
cp $< $@
|
||||
chmod +x $@
|
||||
|
||||
%.1: %.pl
|
||||
podchecker $<
|
||||
pod2man --utf8 --center=" " --release="Debian Utilities" $< > $@
|
||||
%.1: %.pod
|
||||
podchecker $<
|
||||
pod2man --utf8 --center=" " --release="Debian Utilities" $< > $@
|
||||
%.1: %.dbk
|
||||
xsltproc --nonet -o $@ \
|
||||
/usr/share/sgml/docbook/stylesheet/xsl/nwalsh/manpages/docbook.xsl $<
|
||||
|
||||
# Syntax checker
|
||||
test_sh: $(SH_CHECKS)
|
||||
%.sh_check: %
|
||||
bash -n $<
|
||||
|
||||
test_pl: $(PL_CHECKS)
|
||||
%.pl_check: %
|
||||
perl -I ../lib -c $<; \
|
||||
|
||||
devscripts/__init__.py: ../debian/changelog
|
||||
# Generate devscripts/__init__.py
|
||||
python3 setup.py build
|
||||
|
||||
test_py: devscripts/__init__.py
|
||||
$(foreach python,$(shell py3versions -r ../debian/control),$(python) -m unittest discover devscripts$(\n))
|
||||
|
||||
debbisect.1: debbisect
|
||||
help2man \
|
||||
--name="bisect snapshot.debian.org" \
|
||||
--version-string=$(VERSION) \
|
||||
--no-info \
|
||||
--no-discard-stderr \
|
||||
./$< >$@
|
||||
|
||||
debftbfs.1: debftbfs
|
||||
help2man \
|
||||
--name="list packages that have FTBFS bugs filed" \
|
||||
--version-string=$(VERSION) \
|
||||
--no-info \
|
||||
--no-discard-stderr \
|
||||
./$< >$@
|
||||
|
||||
debootsnap.1: debootsnap
|
||||
help2man \
|
||||
--name="create debian chroot using snapshot.debian.org" \
|
||||
--version-string=$(VERSION) \
|
||||
--no-info \
|
||||
--no-discard-stderr \
|
||||
./$< >$@
|
||||
|
||||
debrebuild.1: debrebuild
|
||||
help2man \
|
||||
--name="use a buildinfo file and snapshot.d.o to recreate binary packages" \
|
||||
--version-string=$(VERSION) \
|
||||
--no-info \
|
||||
--no-discard-stderr \
|
||||
./$< >$@
|
||||
|
||||
reproducible-check.1: reproducible-check
|
||||
help2man \
|
||||
--name="Reports on the reproducible status of installed packages" \
|
||||
--no-info \
|
||||
--no-discard-stderr \
|
||||
./$< >$@
|
||||
|
||||
$(BC_BUILD_DIR):
|
||||
mkdir $(BC_BUILD_DIR)
|
||||
|
||||
$(COMPLETION): $(BC_BUILD_DIR)/% : %.bash_completion $(BC_BUILD_DIR)
|
||||
cp $< $@
|
||||
|
||||
clean:
|
||||
rm -f devscripts/__init__.py
|
||||
find -name '*.pyc' -delete
|
||||
find -name __pycache__ -delete
|
||||
rm -rf devscripts.egg-info $(BC_BUILD_DIR) .pylint.d
|
||||
rm -f $(SCRIPTS) $(patsubst %,%.tmp,$(SCRIPTS)) \
|
||||
$(GEN_MAN1S) $(SCRIPT_LIBS)
|
||||
rm -rf build
|
||||
|
||||
test: test_pl test_sh test_py
|
||||
|
||||
install: all
|
||||
python3 setup.py install --root="$(DESTDIR)" --no-compile --install-layout=deb
|
||||
cp $(SCRIPTS) $(DESTDIR)$(BINDIR)
|
||||
ln -sf edit-patch $(DESTDIR)$(BINDIR)/add-patch
|
||||
install -d $(DESTDIR)$(COMPL_DIR)
|
||||
cp $(BC_BUILD_DIR)/* $(DESTDIR)$(COMPL_DIR)/
|
||||
for i in $(PKGNAMES); do \
|
||||
ln -sf pkgnames $(DESTDIR)$(COMPL_DIR)/$$i; \
|
||||
done
|
||||
ln -sf debchange $(DESTDIR)$(COMPL_DIR)/dch
|
||||
ln -sf debi $(DESTDIR)$(COMPL_DIR)/debc
|
||||
# Special treatment for run_bisect
|
||||
install -d $(DESTDIR)$(DATA_DIR)/scripts
|
||||
mv $(DESTDIR)$(BINDIR)/run_bisect $(DESTDIR)$(DATA_DIR)/scripts
|
||||
mv $(DESTDIR)$(BINDIR)/run_bisect_qemu $(DESTDIR)$(DATA_DIR)/scripts
|
||||
|
||||
.PHONY: test test_pl test_sh test_py all install clean scripts
|
120
scripts/annotate-output.1
Normal file
120
scripts/annotate-output.1
Normal file
|
@@ -0,0 +1,120 @@
|
|||
.TH ANNOTATE-OUTPUT 1 "Debian Utilities" "DEBIAN" \" -*- nroff -*-
|
||||
.SH NAME
|
||||
annotate-output \- annotate program output with time and stream
|
||||
.SH SYNOPSIS
|
||||
\fBannotate\-output\fR [\fIoptions\fR ...] [--] \fIprogram\fR [\fIargs\fR ...]
|
||||
.SH DESCRIPTION
|
||||
\fBannotate\-output\fR executes \fIprogram\fR with \fIargs\fR as arguments
and prepends each printed line with a format string, followed by an indicator
for the stream on which the line was printed, a colon, and a single space.
|
||||
.br
|
||||
The stream indicators are \fBI\fR for information from
|
||||
\fBannotate\-output\fR as well as \fBO\fR for STDOUT and \fBE\fR for STDERR
|
||||
from \fIprogram\fR.
|
||||
|
||||
|
||||
.SH OPTIONS
|
||||
.TP
|
||||
\fB+FORMAT\fR
|
||||
A format string that may use the conversion specifiers from the
|
||||
\fBdate\fR(1)-utility. The printed string is separated from the following
|
||||
stream indicator by a single space. May be overridden by later options that
|
||||
specify the format string.
|
||||
.br
|
||||
Defaults to "%H:%M:%S".
|
||||
.TP
|
||||
\fB--raw-date-format\fR \fIFORMAT\fR
|
||||
A format string that may use the conversion specifiers from the
|
||||
\fBdate\fR(1)-utility. There is no separator between the printed string and
|
||||
the following stream indicator. May be overridden by later options that
|
||||
specify the format string.
|
||||
.TP
|
||||
\fB--\fR
|
||||
Ends option parsing (unless it is itself an argument to an option).
|
||||
.TP
|
||||
\fB\-h\fR, \fB\-\-help\fR
|
||||
Display a help message.
|
||||
|
||||
.SH EXIT STATUS
|
||||
If \fIprogram\fR is invoked, the exit status of \fBannotate\-output\fR
|
||||
shall be the exit status of \fIprogram\fR; otherwise,
|
||||
\fBannotate\-output\fR shall exit with one of the following values:
|
||||
.TP
|
||||
0
|
||||
\fB\-h\fR or \fB\-\-help\fR was used.
|
||||
.TP
|
||||
125
|
||||
An error occurred in \fBannotate\-output\fR.
|
||||
.TP
|
||||
126
|
||||
\fIprogram\fR was found but could not be invoked.
|
||||
.TP
|
||||
127
|
||||
\fIprogram\fR could not be found or was not specified.
|
||||
|
||||
.SH EXAMPLE
|
||||
|
||||
.nf
|
||||
$ annotate-output make
|
||||
21:41:21 I: Started make
|
||||
21:41:21 O: gcc \-Wall program.c
|
||||
21:43:18 E: program.c: Couldn't compile, and took me ages to find out
|
||||
21:43:19 E: collect2: ld returned 1 exit status
|
||||
21:43:19 E: make: *** [all] Error 1
|
||||
21:43:19 I: Finished with exitcode 2
|
||||
.fi
|
||||
|
||||
.SH CAVEATS AND BUGS
|
||||
Since STDOUT and STDERR are processed in parallel, it can happen that
|
||||
some lines received on STDOUT will show up before later-printed STDERR
|
||||
lines (and vice-versa).
|
||||
.br
|
||||
This is unfortunately very hard to fix with the current annotation
|
||||
strategy. A fix would involve switching to PTRACE'ing the process.
|
||||
Giving \fBannotate\-output\fR a (much) higher priority than \fIprogram\fR
(for example with \fBnice\fR) could however cause this behaviour to show up
less frequently.
|
||||
|
||||
\fBannotate\-output\fR expects \fIprogram\fR to output (text) lines (as
|
||||
specified by POSIX) to STDOUT and STDERR.
|
||||
.br
|
||||
In particular, it leads to undefined behaviour when lines are printed that
|
||||
contain NUL bytes. It further may lead to undefined behaviour when lines
|
||||
are printed that contain bytes that do not form valid characters in the
|
||||
current locale.
|
||||
|
||||
When an interactive \fIprogram\fR asks for input, the question might not be
|
||||
shown until after you have answered it. This will give the impression that
|
||||
\fIprogram\fR has hung, even though it has not.
|
||||
|
||||
\fBannotate\-output\fR is implemented as a script in the Shell Command
|
||||
Language. Shells typically set various (shell) variables when started and
|
||||
may set the `export` attribute on some of them. They further initialise
|
||||
(shell) variables from their own environment (as set by the caller of the
|
||||
shell respectively the caller of \fBannotate\-output\fR) and set the
|
||||
`export` attribute on them.
|
||||
.br
|
||||
It follows from this, that when the caller of \fBannotate\-output\fR wants
|
||||
to set the environment (variables) of \fIprogram\fR, they may get
|
||||
overridden or additional ones may get added by the shell.
|
||||
.br
|
||||
Further, environment variables are in principle allowed to have names (for
|
||||
example `.`) that are not valid shell variable names. POSIX does not
|
||||
specify whether or not such environment variables are exported to programs
|
||||
invoked from the shell. No assumptions can thus be made on whether such
|
||||
environment variables will be exported correctly or at all to \fIprogram\fR.
|
||||
|
||||
.SH "SEE ALSO"
|
||||
\fBdate\fR(1)
|
||||
|
||||
.SH SUPPORT
|
||||
\fBannotate\-output\fR is community-supported (meaning: you'll need to fix
|
||||
it yourself). Patches are however appreciated, as is any feedback
|
||||
(positive or negative).
|
||||
|
||||
.SH AUTHOR
|
||||
This manual page was written by Jeroen van Wolffelaar <jeroen@wolffelaar.nl>
|
||||
and can be redistributed under the terms of the GPL version 2.
|
||||
The \fBannotate-output\fR script itself was re-written by Johannes Schauer
|
||||
Marin Rodrigues <josch@debian.org> and can be redistributed under the terms
|
||||
of the Expat license.
|
140
scripts/annotate-output.sh
Executable file
140
scripts/annotate-output.sh
Executable file
|
@@ -0,0 +1,140 @@
|
|||
#!/bin/sh
|
||||
|
||||
# Copyright 2019-2023 Johannes Schauer Marin Rodrigues <josch@debian.org>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in
|
||||
# all copies or substantial portions of the Software.
|
||||
|
||||
set -eu
|
||||
|
||||
# TODO: Switch from using `/usr/bin/printf` to the (likely built-in) `printf`
|
||||
# once POSIX has standardised `%q` for that
|
||||
# (see https://austingroupbugs.net/view.php?id=1771) and `dash`
|
||||
# implemented it.
|
||||
|
||||
define_get_prefix() {
|
||||
eval " get_prefix() {
|
||||
/usr/bin/printf '%q' $(/usr/bin/printf '%q' "$1")
|
||||
}"
|
||||
}
|
||||
|
||||
define_handler_with_date_conversion_specifiers() {
|
||||
eval " handler() {
|
||||
while IFS= read -r line; do
|
||||
printf '%s%s: %s\\n' \"\$(date $(/usr/bin/printf '%q' "$1") )\" \"\$1\" \"\$line\"
|
||||
done
|
||||
if [ -n \"\$line\" ]; then
|
||||
printf '%s%s: %s' \"\$(date $(/usr/bin/printf '%q' "$1") )\" \"\$1\" \"\$line\"
|
||||
fi
|
||||
}"
|
||||
define_get_prefix "${1#+}"
|
||||
}
|
||||
|
||||
define_handler_with_plain_prefix() {
|
||||
eval " handler() {
|
||||
while IFS= read -r line; do
|
||||
printf '%s%s: %s\\n' $(/usr/bin/printf '%q' "$1") \"\$1\" \"\$line\"
|
||||
done
|
||||
if [ -n \"\$line\" ]; then
|
||||
printf '%s%s: %s' $(/usr/bin/printf '%q' "$1") \"\$1\" \"\$line\"
|
||||
fi
|
||||
}"
|
||||
define_get_prefix "$1"
|
||||
}
|
||||
|
||||
usage() {
|
||||
printf \
|
||||
'Usage: %s [OPTIONS ...] [--] PROGRAM [ARGS ...]
|
||||
Executes PROGRAM with ARGS as arguments and prepends printed lines with a format
|
||||
string, a stream indicator and `: `.
|
||||
|
||||
Options:
|
||||
+FORMAT
|
||||
A format string that may use the conversion specifiers from the `date`(1)-
|
||||
utility.
|
||||
The printed string is separated from the following stream indicator by a
|
||||
single space.
|
||||
Defaults to `%%H:%%M:%%S`.
|
||||
--raw-date-format FORMAT
|
||||
A format string that may use the conversion specifiers from the `date`(1)-
|
||||
utility.
|
||||
The printed string is not separated from the following stream indicator.
|
||||
-h
|
||||
--help
|
||||
Display this help message.
|
||||
' "${0##*/}"
|
||||
}
|
||||
|
||||
define_handler_with_date_conversion_specifiers '+%H:%M:%S '
|
||||
while [ -n "${1-}" ]; do
|
||||
case "$1" in
|
||||
+*%*)
|
||||
define_handler_with_date_conversion_specifiers "$1 "
|
||||
shift
|
||||
;;
|
||||
+*)
|
||||
define_handler_with_plain_prefix "${1#+} "
|
||||
shift
|
||||
;;
|
||||
--raw-date-format)
|
||||
if [ "$#" -lt 2 ]; then
|
||||
printf '%s: The `--raw-date-format`-option requires an argument.\n' "${0##*/}" >&2
|
||||
exit 125
|
||||
fi
|
||||
case "$2" in
|
||||
*%*) define_handler_with_date_conversion_specifiers "+$2";;
|
||||
*) define_handler_with_plain_prefix "${2#+}";;
|
||||
esac
|
||||
shift 2
|
||||
;;
|
||||
-h|--help)
|
||||
usage
|
||||
exit 0
|
||||
;;
|
||||
--)
|
||||
shift
|
||||
break
|
||||
;;
|
||||
*)
|
||||
break
|
||||
;;
|
||||
esac
|
||||
done
|
||||
|
||||
if [ "$#" -lt 1 ]; then
|
||||
printf '%s: No program to be executed was specified.\n' "${0##*/}" >&2
|
||||
exit 127
|
||||
fi
|
||||
|
||||
printf 'I: annotate-output %s\n' '###VERSION###'
|
||||
printf 'I: prefix='
|
||||
get_prefix
|
||||
printf '\n'
|
||||
{ printf 'Started'; /usr/bin/printf ' %q' "$@"; printf '\n'; } | handler I
|
||||
|
||||
# The following block redirects FD 2 (STDERR) to FD 1 (STDOUT) which is then
|
||||
# processed by the STDERR handler. It redirects FD 1 (STDOUT) to FD 4 such
|
||||
# that it can later be moved to FD 1 (STDOUT) and handled by the STDOUT handler.
|
||||
# The exit status of the program gets written to FD 2 (STDERR) which is then
|
||||
# captured to produce the correct exit status as the last step of the pipe.
|
||||
# Both the STDOUT and STDERR handler output to FD 3 such that after exiting
|
||||
# with the correct exit code, FD 3 can be redirected to FD 1 (STDOUT).
|
||||
{
|
||||
{
|
||||
{
|
||||
{
|
||||
{
|
||||
"$@" 2>&1 1>&4 3>&- 4>&-; printf "$?\n" >&2;
|
||||
} | handler E >&3;
|
||||
} 4>&1 | handler O >&3;
|
||||
} 2>&1;
|
||||
} | { IFS= read -r xs; exit "$xs"; };
|
||||
} 3>&1 && { printf 'Finished with exitcode 0\n' | handler I; exit 0; } \
|
||||
|| { err="$?"; printf "Finished with exitcode $err\n" | handler I; exit "$err"; }
|
63
scripts/archpath.1
Normal file
63
scripts/archpath.1
Normal file
|
@@ -0,0 +1,63 @@
|
|||
.TH ARCHPATH 1 "Debian Utilities" "DEBIAN" \" -*- nroff -*-
|
||||
.SH NAME
|
||||
archpath \- output arch (tla/Bazaar) archive names, with support for branches
|
||||
.SH SYNOPSIS
|
||||
.B archpath
|
||||
.br
|
||||
.B archpath
|
||||
.I branch
|
||||
.br
|
||||
.B archpath
|
||||
.IR branch \fB--\fI version
|
||||
.SH DESCRIPTION
|
||||
.B archpath
|
||||
is intended to be run in an arch (tla or Bazaar) working copy.
|
||||
.PP
|
||||
In its simplest usage,
|
||||
.B archpath
|
||||
with no parameters outputs the package name
|
||||
(archive/category--branch--version) associated with the working copy.
|
||||
.PP
|
||||
If a parameter is given, it may either be a branch--version, in which case
|
||||
.B archpath
|
||||
will output a corresponding package name in the current archive and
|
||||
category, or a plain branch name (without \(oq--\(cq), in which case
|
||||
.B archpath
|
||||
will output a corresponding package name in the current archive and
|
||||
category and with the same version as the current working copy.
|
||||
.PP
|
||||
This is useful for branching.
|
||||
For example, if you're using Bazaar and you want to create a branch for a
|
||||
new feature, you might use a command like this:
|
||||
.PP
|
||||
.RS
|
||||
.nf
|
||||
.ft CW
|
||||
baz branch $(archpath) $(archpath new-feature)
|
||||
.ft R
|
||||
.fi
|
||||
.RE
|
||||
.PP
|
||||
Or if you want to tag your current code onto a \(oqreleases\(cq branch as
|
||||
version 1.0, you might use a command like this:
|
||||
.PP
|
||||
.RS
|
||||
.nf
|
||||
.ft CW
|
||||
baz branch $(archpath) $(archpath releases--1.0)
|
||||
.ft R
|
||||
.fi
|
||||
.RE
|
||||
.PP
|
||||
That's much easier than using \(oqbaz tree-version\(cq to look up the
|
||||
package name and manually modifying the result.
|
||||
.SH AUTHOR
|
||||
.B archpath
|
||||
was written by
|
||||
.na
|
||||
Colin Watson <cjwatson@debian.org>.
|
||||
.ad
|
||||
Like
|
||||
.BR archpath ,
|
||||
this manual page is released under the GNU General Public License,
|
||||
version 2 or later.
|
46
scripts/archpath.sh
Executable file
46
scripts/archpath.sh
Executable file
|
@@ -0,0 +1,46 @@
|
|||
#!/bin/bash
|
||||
|
||||
# Output arch (tla/Bazaar) archive names, with support for branches
|
||||
|
||||
# Copyright (C) 2005 Colin Watson <cjwatson@debian.org>
|
||||
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation; either version 2, or (at your option) any
|
||||
# later version.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
set -e
|
||||
|
||||
# Which arch implementation should we use?
|
||||
if type baz > /dev/null 2>&1; then
|
||||
PROGRAM=baz
|
||||
else
|
||||
PROGRAM=tla
|
||||
fi
|
||||
|
||||
WANTED="$1"
|
||||
ME="$($PROGRAM tree-version)"
|
||||
|
||||
if [ "$WANTED" ]; then
|
||||
ARCHIVE="$($PROGRAM parse-package-name --arch "$ME")"
|
||||
CATEGORY="$($PROGRAM parse-package-name --category "$ME")"
|
||||
case $WANTED in
|
||||
*--*)
|
||||
echo "$ARCHIVE/$CATEGORY--$WANTED"
|
||||
;;
|
||||
*)
|
||||
VERSION="$($PROGRAM parse-package-name --vsn "$ME")"
|
||||
echo "$ARCHIVE/$CATEGORY--$WANTED--$VERSION"
|
||||
;;
|
||||
esac
|
||||
else
|
||||
echo "$ME"
|
||||
fi
|
319
scripts/bts.bash_completion
Normal file
319
scripts/bts.bash_completion
Normal file
|
@@ -0,0 +1,319 @@
|
|||
# /usr/share/bash-completion/completions/bts
|
||||
# Bash command completion for ‘bts(1)’.
|
||||
# Documentation: ‘bash(1)’, section “Programmable Completion”.
|
||||
|
||||
# Copyright © 2015, Nicholas Bamber <nicholas@periapt.co.uk>
|
||||
|
||||
_get_version_from_package()
|
||||
{
|
||||
local _pkg=$1
|
||||
[[ -n $_pkg ]] || return
|
||||
apt-cache madison $_pkg 2> /dev/null | cut -d'|' -f2 | sort | uniq | paste -s -d' '
|
||||
}
|
||||
|
||||
# This works really well unless someone sets up nasty firewall rules like:
|
||||
# sudo iptables -A OUTPUT -d 206.12.19.140 -j DROP
|
||||
# sudo iptables -A OUTPUT -d 140.211.166.26 -j DROP
|
||||
# These block access to the Debian bugs SOAP interface.
|
||||
# Hence we need a timeout.
|
||||
# Of course if the SOAP interface is blocked then so is the caching interface.
|
||||
# So really this would only affect someone who only accidentally hit the TAB key.
|
||||
_get_version_from_bug()
|
||||
{
|
||||
local -i _bug=$1
|
||||
_get_version_from_package $( bts --soap-timeout=2 status $_bug fields:package 2> /dev/null | cut -f2 )
|
||||
}
|
||||
|
||||
_suggest_packages()
|
||||
{
|
||||
apt-cache --no-generate pkgnames "$1" 2> /dev/null
|
||||
}
|
||||
|
||||
_suggest_bugs()
|
||||
{
|
||||
bts --offline listcachedbugs "$1" 2> /dev/null
|
||||
}
|
||||
|
||||
_bts()
|
||||
{
|
||||
local cur prev words cword
|
||||
_init_completion -n = || return
|
||||
|
||||
# Note:
|
||||
# The long lists of subcommands are not the same and not necessarily to be kept in sync.
|
||||
# The first is used to suggest commands after a '.' or ','.
|
||||
# The second is to hook in special handling (which may be as little as admitting
# we can't handle it further) or the default special handling (list of bug ids).
|
||||
# This also includes "by" and "with" which are not even subcommands.
|
||||
# The third is similar to the first - what to suggest after the bts command (and options).
|
||||
# but this includes the "help" and "version" commands.
|
||||
|
||||
# A sequence of bts commands can be on one command line separated by "." or ",".
|
||||
if [[ $prev == @(.|,) ]]; then
|
||||
COMPREPLY=( $( compgen -W 'show bugs unmerge select status clone done reopen archive unarchive retitle summary submitter reassign found notfound fixed notfixed block unblock merge forcemerge tags affects user usertags claim unclaim severity forwarded notforwarded package limit owner noowner subscribe unsubscribe reportspam spamreport' -- "$cur" ) )
|
||||
return 0
|
||||
fi
|
||||
|
||||
# Identify the last command in the command line.
|
||||
local special punctuation i
|
||||
for (( i=${#words[@]}-1; i > 0; i-- )); do
|
||||
if [[ ${words[i]} == @(show|bugs|select|limit|unmerge|status|clone|done|reopen|archive|unarchive|retitle|summary|submitter|reassign|found|notfound|fixed|notfixed|block|unblock|merge|forcemerge|tags|affects|user|usertags|claim|unclaim|severity|forwarded|notforwarded|package|owner|noowner|subscribe|unsubscribe|reportspam|spamreport|cache|cleancache|by|with) ]]; then
|
||||
special=${words[i]}
|
||||
break
|
||||
fi
|
||||
if [[ ${words[i]} == @(+|-|=) ]]; then
|
||||
punctuation=${words[i]}
|
||||
fi
|
||||
done
|
||||
|
||||
if [[ -n $special ]]; then
|
||||
|
||||
# The command separator must be surrounded by white space.
|
||||
if [[ "$cur" == @(,|.) ]]; then
|
||||
COMPREPLY=( $cur )
|
||||
return 0
|
||||
fi
|
||||
|
||||
case $special in
|
||||
show|bugs)
|
||||
# bugs/show supports a few limited options
|
||||
# but as args we accept bug ids, package names and release-critical
|
||||
if [[ "$cur" == -* ]]; then
|
||||
COMPREPLY=( $( compgen -W '-o --offline --online -m --mbox \
|
||||
--no-cache --cache' -- "$cur" ) )
|
||||
elif [[ "$cur" == release-critical/* ]]; then
|
||||
local _pkg=${cur#release-critical/}
|
||||
COMPREPLY=( $( _suggest_packages "$_pkg" | sed -e's!^!release-critical/!' ) )
|
||||
else
|
||||
COMPREPLY=( $( compgen -W 'release-critical RC' -- "$cur" ) \
|
||||
$( _suggest_bugs "$cur" ) \
|
||||
$( _suggest_packages "$cur" ) )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
status)
|
||||
# we accept "verbose" and bug ids
|
||||
COMPREPLY=( $( compgen -W 'verbose' -- "$cur" ) \
|
||||
$( _suggest_bugs "$cur" ) )
|
||||
return 0
|
||||
;;
|
||||
clone)
|
||||
# we accept 1 bug id and then generate new clone ids
|
||||
if [[ "$prev" == +([0-9]) ]]; then
|
||||
COMPREPLY=( $( compgen -W '-1' -- "$cur" ) )
|
||||
elif [[ "$prev" == -+([0-9]) ]]; then
|
||||
local -i j
|
||||
(( j=$prev-1 ))
|
||||
COMPREPLY=( $( compgen -W $j -- "$cur" ) )
|
||||
else
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
done|found|notfound|fixed|notfixed)
|
||||
# Try to guess the version
|
||||
if [[ "$prev" == +([0-9]) ]]; then
|
||||
local _versions=$( _get_version_from_bug $prev )
|
||||
if [[ -n $_versions ]]; then
|
||||
COMPREPLY=( $( compgen -W $_versions -- "$cur" ) )
|
||||
else
|
||||
COMPREPLY=( )
|
||||
fi
|
||||
else
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
reopen|claim|unclaim|owner|subscribe|unsubscribe)
|
||||
if [[ "$prev" == +([0-9]) && -n $DEBEMAIL ]]; then
|
||||
COMPREPLY=( $( compgen -W $DEBEMAIL -- "$cur" ) )
|
||||
else
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
reassign)
|
||||
# Must have at least one bug id.
|
||||
# Once we have a package name, all that remains is an optional version.
|
||||
if [[ "$prev" == $special ]]; then
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
elif [[ "$prev" == +([0-9]) ]]; then
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) \
|
||||
$( _suggest_packages "$cur" ) )
|
||||
else
|
||||
local _versions=$( _get_version_from_package $prev )
|
||||
COMPREPLY=( $( compgen -W $_versions -- "$cur" ) )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
block|unblock)
|
||||
# Must have at least one bug id.
|
||||
if [[ "$prev" == $special ]]; then
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
elif [[ "$prev" == +([0-9]) ]]; then
|
||||
COMPREPLY=( $( compgen -W 'by with' -- "$cur" ) )
|
||||
else
|
||||
COMPREPLY=( )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
unmerge|forwarded|notforwarded|noowner)
|
||||
# Must have at most one bug id.
|
||||
if [[ "$prev" == $special ]]; then
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
else
|
||||
COMPREPLY=( )
|
||||
fi
|
||||
return 0
|
||||
;;
|
||||
tags)
|
||||
# Must have one bug id.
|
||||
if [[ "$prev" == $special ]]; then
|
||||
COMPREPLY=( $( _suggest_bugs "$cur" ) )
|
||||
elif [[ -n $punctuation ]]; then
|
||||
# The official list from
# https://www.debian.org/Bugs/server-control#tag
# is mirrored in the variable @gTags; we copy it verbatim here.
|
||||
COMPREPLY=( $( compgen -W 'patch wontfix moreinfo unreproducible fixed potato woody sid help security upstream pending sarge sarge-ignore experimental d-i confirmed ipv6 lfs fixed-in-experimental fixed-upstream l10n newcomer a11y ftbfs etch etch-ignore lenny lenny-ignore squeeze squeeze-ignore wheezy wheezy-ignore jessie jessie-ignore stretch stretch-ignore buster buster-ignore bullseye bullseye-ignore' -- "$cur" ) )
|
||||
else
|
||||
COMPREPLY=()
|
||||
COMPREPLY[0]='= '
|
||||
COMPREPLY[1]='+ '
|
||||
COMPREPLY[2]='- '
|
||||
fi
|
||||
return 0
|
||||
;;
            affects)
                # Must have one bug id.
                if [[ "$prev" == $special ]]; then
                    COMPREPLY=( $( _suggest_bugs "$cur" ) )
                elif [[ -n $punctuation ]]; then
                    COMPREPLY=( $( _suggest_packages "$cur" ) )
                else
                    COMPREPLY=()
                    COMPREPLY[0]='= '
                    COMPREPLY[1]='+ '
                    COMPREPLY[2]='- '
                fi
                return 0
                ;;
            user)
                if [[ "$prev" == $special && -n $DEBEMAIL ]]; then
                    COMPREPLY=( $( compgen -W $DEBEMAIL -- "$cur" ) )
                else
                    COMPREPLY=( )
                fi
                return 0
                ;;
            usertags)
                # Must have one bug id.
                if [[ "$prev" == $special ]]; then
                    COMPREPLY=( $( _suggest_bugs "$cur" ) )
                elif [[ -z $punctuation ]]; then
                    COMPREPLY=()
                    COMPREPLY[0]='= '
                    COMPREPLY[1]='+ '
                    COMPREPLY[2]='- '
                else
                    COMPREPLY=()
                fi
                return 0
                ;;
            severity)
                if [[ "$prev" == $special ]]; then
                    COMPREPLY=( $( _suggest_bugs "$cur" ) )
                elif [[ "$prev" == +([0-9]) ]]; then
                    COMPREPLY=( $( compgen -W 'wishlist minor normal important serious \
                        grave critical' -- "$cur" ) )
                else
                    COMPREPLY=()
                fi
                return 0
                ;;
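            # Example (hypothetical bug number): "bts severity 123456 <TAB>"
            # offers wishlist, minor, normal, important, serious, grave and
            # critical.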
            select|limit)
                # Completion can't handle the ":" in select/limit terms yet,
                # so give up for now.
                COMPREPLY=( )
                return 0
                ;;
            package)
                COMPREPLY=( $( _suggest_packages "$cur" ) )
                return 0
                ;;
            cache)
                # cache supports a few limited options, but as arguments we
                # accept bug ids, package names and release-critical
                if [[ "$prev" == --cache-mode ]]; then
                    COMPREPLY=( $( compgen -W 'min mbox full' -- "$cur" ) )
                elif [[ "$cur" == release-critical/* ]]; then
                    local _pkg=${cur#release-critical/}
                    COMPREPLY=( $( _suggest_packages "$_pkg" | sed -e's!^!release-critical/!' ) )
                elif [[ "$cur" == -* ]]; then
                    COMPREPLY=( $( compgen -W '--cache-mode --force-refresh -f \
                        --include-resolved -q --quiet' -- "$cur" ) )
                else
                    COMPREPLY=( $( compgen -W 'release-critical RC' -- "$cur" ) \
                                $( _suggest_packages "$cur" ) )
                fi
                return 0
                ;;
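            # Example (illustrative): "bts cache release-critical/<TAB>"
            # completes package names with the release-critical/ prefix
            # re-attached, and "bts cache --cache-mode <TAB>" offers min,
            # mbox and full.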
            cleancache)
                if [[ "$prev" == $special ]]; then
                    COMPREPLY=( $( compgen -W 'ALL' -- "$cur" ) \
                                $( _suggest_bugs "$cur" ) \
                                $( _suggest_packages "$cur" ) )
                else
                    COMPREPLY=( )
                fi
                return 0
                ;;
            *)
                COMPREPLY=( $( _suggest_bugs "$cur" ) )
                return 0
                ;;
        esac
    fi

    case $prev in
        --cache-mode)
            COMPREPLY=( $( compgen -W 'min mbox full' -- "$cur" ) )
            return 0
            ;;
        --cache-delay)
            COMPREPLY=( $( compgen -W '5 60 120 240 600' -- "$cur" ) )
            return 0
            ;;
    esac

    if [[ "$cur" == -* ]]; then
        COMPREPLY=( $( compgen -W '-o --offline --online -n --no-action --cache --no-cache --cache-mode --cache-delay --mbox --no-use-default-cc --mutt --no-mutt -f --force-refresh --no-force-refresh --only-new --include-resolved --no-include-resolved --no-ack --ack -i --interactive --force-interactive --no-interactive -q --quiet' -- "$cur" ) )
    else
        COMPREPLY=( $( compgen -W 'show bugs unmerge select status clone done reopen archive unarchive retitle summary submitter reassign found notfound fixed notfixed block unblock merge forcemerge tags affects user usertags claim unclaim severity forwarded notforwarded package limit owner noowner subscribe unsubscribe reportspam spamreport cache cleancache version help' -- "$cur" ) )
    fi
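    # Example (illustrative): at this point "bts --ca<TAB>" offers --cache,
    # --cache-mode and --cache-delay, while a non-option word completes
    # against the subcommand names listed above.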

    # !!! not handled !!!
    # --mailreader=READER
    # --cc-addr=CC_EMAIL_ADDRESS
    # --use-default-cc
    # --sendmail=SENDMAILCMD
    # --smtp-host=SMTPHOST
    # --smtp-username=USERNAME
    # --smtp-password=PASSWORD
    # --smtp-helo=HELO
    # --bts-server
    # --no-conf, --noconf
    #
    # anything containing colons, for now
    # for similar reasons there are issues with "tags XXXX ="
    # no special handling for select

    return 0
} &&
complete -F _bts bts
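# A quick way to exercise the completion (the path below is an assumption;
# source the file from wherever it is installed):
#
#     . /usr/share/bash-completion/completions/bts
#     complete -p bts        # prints: complete -F _bts bts
#     bts seve<TAB>          # completes to "bts severity "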


# Local variables:
# coding: utf-8
# mode: shell-script
# indent-tabs-mode: nil
# End:
# vim: fileencoding=utf-8 filetype=sh expandtab shiftwidth=4 :
4352
scripts/bts.pl
Executable file
File diff suppressed because it is too large
Some files were not shown because too many files have changed in this diff