[OE-core] [PATCH 1/5] archiver.bbclass: Handle gitsm URLs in the mirror archiver
Paul Barker
pbarker at konsulko.com
Wed Mar 11 11:50:25 UTC 2020
On Wed, 11 Mar 2020 11:38:44 +0000
Richard Purdie <richard.purdie at linuxfoundation.org> wrote:
> On Wed, 2020-03-11 at 11:31 +0000, Paul Barker wrote:
> > On Tue, 10 Mar 2020 23:16:38 +0000
> > Richard Purdie <richard.purdie at linuxfoundation.org> wrote:
> >
> > > On Mon, 2020-03-09 at 14:21 +0000, Paul Barker wrote:
> > > > To fully archive a `gitsm://` entry in SRC_URI we need to also capture
> > > > the submodules recursively. If shallow mirror tarballs are found, they
> > > > must be temporarily extracted so that the submodules can be determined.
> > > >
> > > > Signed-off-by: Paul Barker <pbarker at konsulko.com>
> > > > ---
> > > > meta/classes/archiver.bbclass | 31 ++++++++++++++++++++++++++-----
> > > > 1 file changed, 26 insertions(+), 5 deletions(-)
> > > >
> > > > diff --git a/meta/classes/archiver.bbclass b/meta/classes/archiver.bbclass
> > > > index 013195df7d..fef7ad4f62 100644
> > > > --- a/meta/classes/archiver.bbclass
> > > > +++ b/meta/classes/archiver.bbclass
> > > > @@ -306,7 +306,7 @@ python do_ar_configured() {
> > > > }
> > > >
> > > > python do_ar_mirror() {
> > > > - import subprocess
> > > > + import shutil, subprocess, tempfile
> > > >
> > > > src_uri = (d.getVar('SRC_URI') or '').split()
> > > > if len(src_uri) == 0:
> > > > @@ -337,12 +337,10 @@ python do_ar_mirror() {
> > > >
> > > > bb.utils.mkdirhier(destdir)
> > > >
> > > > - fetcher = bb.fetch2.Fetch(src_uri, d)
> > > > -
> > > > - for url in fetcher.urls:
> > > > + def archive_url(fetcher, url):
> > > > if is_excluded(url):
> > > > bb.note('Skipping excluded url: %s' % (url))
> > > > - continue
> > > > + return
> > > >
> > > > bb.note('Archiving url: %s' % (url))
> > > > ud = fetcher.ud[url]
> > > > @@ -376,6 +374,29 @@ python do_ar_mirror() {
> > > > bb.note('Copying source mirror')
> > > > cmd = 'cp -fpPRH %s %s' % (localpath, destdir)
> > > > subprocess.check_call(cmd, shell=True)
> > > > +
> > > > + if url.startswith('gitsm://'):
> > > > + def archive_submodule(ud, url, module, modpath, workdir, d):
> > > > + url += ";bareclone=1;nobranch=1"
> > > > + newfetch = bb.fetch2.Fetch([url], d, cache=False)
> > > > +
> > > > + for url in newfetch.urls:
> > > > + archive_url(newfetch, url)
> > > > +
> > > > + # If we're using a shallow mirror tarball it needs to be unpacked
> > > > + # temporarily so that we can examine the .gitmodules file
> > > > + if ud.shallow and os.path.exists(ud.fullshallow) and ud.method.need_update(ud, d):
> > > > + tmpdir = tempfile.mkdtemp(dir=d.getVar("DL_DIR"))
> > > > + subprocess.check_call("tar -xzf %s" % ud.fullshallow, cwd=tmpdir, shell=True)
> > > > + ud.method.process_submodules(ud, tmpdir, archive_submodule, d)
> > > > + shutil.rmtree(tmpdir)
> > > > + else:
> > > > + ud.method.process_submodules(ud, ud.clonedir, archive_submodule, d)
> > > > +
> > > > + fetcher = bb.fetch2.Fetch(src_uri, d, cache=False)
> > > > +
> > > > + for url in fetcher.urls:
> > > > + archive_url(fetcher, url)
> > > > }
> > >
> > > I can't help feeling that this is basically a sign the fetcher is
> > > broken.
> > >
> > > What should really happen here is that there should be a method in the
> > > fetcher we call into.
> > >
> > > Instead we're teaching code how to hack around the fetcher. Would it be
> > > possible to add some API we could call into here and maintain integrity
> > > of the fetcher API?
> >
> > This is gitsm-specific so the process_submodules method is probably the
> > correct fetcher API. We need to call back into an archiver-supplied function
> > for each submodule that is found.
> >
> > I guess process_submodules could do the temporary unpacking of the shallow
> > archive and then this code would be simplified. Is that what you had in mind?
>
>
> Nearly. The "operation" here is similar to "download" or "unpack" but
> amounts to "make a mirror copy". Should the fetcher have such a method,
> which would then have the fetcher implementation details in the
> fetchers themselves?
I structured things this way after the discussions we've had previously about
not wanting to add too many new code paths to the fetcher. I'd also like to
keep the logic in a bbclass as much as possible so that it can be more easily
carried as a local backport to earlier Yocto Project releases.
I do see your point, though: this approach is liable to grow warts over time as
special cases are added for different fetchers.
The cause of the warts here is that the gitsm fetcher downloads and creates
mirror tarballs for sources which aren't listed in SRC_URI. The archiver
would be simpler if we could assume that all sources are included in SRC_URI.
Perhaps the solution is not to add a "make a mirror copy" API but instead to add
an "expand SRC_URI with any dependencies" API that the archiver can call
before it iterates over the list of sources.
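To make the idea concrete, here is a minimal sketch of what such an expansion
API might look like. All names here (expand_dependencies, implicit_urls, the
stand-in FetchMethod and GitSM classes, and the example URLs) are hypothetical
and invented for illustration; they are not part of the real bb.fetch2 API.
The point is only the shape: each fetcher reports the extra sources it would
pull in implicitly, and the archiver walks a flat, fully expanded list.

```python
class FetchMethod:
    """Stand-in for bb.fetch2.FetchMethod (hypothetical API sketch)."""

    def implicit_urls(self, url, d):
        # Most fetchers download nothing beyond the listed URL.
        return []


class GitSM(FetchMethod):
    """Stand-in for the gitsm fetcher: submodules are implicit sources.

    A real implementation would walk .gitmodules recursively (unpacking a
    shallow mirror tarball first if necessary); here the submodule list is
    simply injected for demonstration.
    """

    def __init__(self, submodule_urls):
        self.submodule_urls = submodule_urls

    def implicit_urls(self, url, d):
        return list(self.submodule_urls)


def expand_dependencies(src_uri, methods, d=None):
    """Return src_uri plus every implicit source, depth-first.

    'methods' maps each URL to its fetcher object; URLs with no entry are
    treated as having no implicit dependencies.
    """
    result = []
    stack = list(reversed(src_uri))
    while stack:
        url = stack.pop()
        result.append(url)
        method = methods.get(url)
        if method:
            # Push sub-sources so they are archived right after their parent.
            for sub in reversed(method.implicit_urls(url, d)):
                stack.append(sub)
    return result
```

With an API along these lines, do_ar_mirror() would reduce to a single loop
over the expanded list, with no gitsm-specific knowledge in the bbclass.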
--
Paul Barker
Konsulko Group