[isabelle-dev] Future and maintenance of ~isabelle/contrib_devel at TUM NFS
krauss at in.tum.de
Sun Jul 1 23:53:51 CEST 2012
On 06/29/2012 04:37 PM, Makarius wrote:
> I did not know this "build artifacts" business yet, but after some web
> browsing I now understand for what SVN used to be abused in the past.
> Here is a comparison matrix for the 3 main players in the field (from
> the Maven perspective):
> Artifactory appears to win in most respects; this is consistent with
> what I've seen on Stackoverflow discussions etc. about the same question
> "foo vs. bar vs. baz" in artifact management.
But almost none of the categories discussed there are even remotely
relevant for our purposes.
> BTW Mercurial classifies largefiles (and subrepositories) as "Features
> of Last Resort" http://mercurial.selenic.com/wiki/FeaturesOfLastResort
> What we have is a monotonic store
> of "artifacts", where certain Isabelle versions take a projection
> according to Admin/components or similar. This allows using the
> Isabelle history in the expected way, e.g. bisect over a certain range
> with the
> right components being used automatically. No restriction to "latest
> this, latest that", with implicit meaning of "latest".
Agreed. I had already abandoned the largefiles idea myself by now.
> Artifactory seems to be the "solves all your problems" solution, but it
> is also quite large. See also http://www.jfrog.com/products.php
> Here is a live demo
> This means all this infrastructure offers a plain hierarchic web
> download in the end -- no need to use Maven. A modest wget script can
> still download components as plain URLs. On the server side it is a bit
> more than just a modest python script, though ...
It is complete overkill, IMHO. And these Java-based servers built on web
containers (Jetty et al.) do not always play nicely in a Unix
environment; e.g., shutting them down does not always work reliably. I
think we should build something with standard Unix tools instead of
relying on yet another server from a different universe.
So here is my latest low-tech proposal:
* /home/isabelle/components is the components repository, where all
components are stored. They are stored as tarballs.
* The existing php script can be used to serve this directory via HTTP.
* Non-free components are marked as such simply via file permissions,
i.e., by having the world-readable flag unset. Since Apache runs under
group "isabelle", we might have to set the group to something else
(e.g., an imagined "isabelle-admin").
* Integrity of this directory is ensured by a cron job which compares
the output of "sha1sum /home/isabelle/components/*" with a file
in the Admin section of the Isabelle repository. So we can easily detect
accidents (and revert them, possibly with the help of the standard
backups). Such a script is easy to write, and I already have
some fragments lying around here.
* /home/isabelle/contrib is maintained automatically by unpacking the
contents of the tarballs (and setting permissions properly).
* A similar script in Admin/ can download components via HTTP and link
them into a clone of the Isabelle repository.
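As a sketch of the integrity check described above (all paths and file
names here are illustrative stand-ins; the real reference file would
live in the Admin section of the Isabelle repository, and the real store
is /home/isabelle/components):

```shell
#!/bin/sh
# Hypothetical integrity check for the component store: compare current
# checksums against a reference file kept under version control.

COMPONENTS=${COMPONENTS:-/tmp/components_demo}
REFERENCE=${REFERENCE:-/tmp/components_demo.sha1}

# Demo setup only: create a store and record its checksums.
# A real cron job would run just the check at the bottom.
mkdir -p "$COMPONENTS"
printf 'demo tarball contents\n' > "$COMPONENTS/foo-1.0.tar.gz"
( cd "$COMPONENTS" && sha1sum * ) > "$REFERENCE"

# The actual check: recompute checksums and verify them against the
# reference; complain loudly on any mismatch.
if ( cd "$COMPONENTS" && sha1sum --quiet -c "$REFERENCE" ); then
  echo "components OK"
else
  echo "components CORRUPTED" >&2
fi
```

A cron entry would run only the final check and mail its stderr to the
admins; on a mismatch, the standard backups can be consulted.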
I would say that 30 lines of bash will do, plus another 30 lines for a
README, which goes into the same directory.
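To illustrate the order of magnitude, here is a sketch of the
contrib-unpacking step (directory names are again illustrative; the real
directories would be /home/isabelle/components and
/home/isabelle/contrib):

```shell
#!/bin/sh
# Hypothetical sketch: unpack every tarball from the component store into
# the contrib directory and make the result world-readable.

STORE=${STORE:-/tmp/components_demo2}
CONTRIB=${CONTRIB:-/tmp/contrib_demo}

# Demo setup only: fabricate one component tarball in the store.
mkdir -p "$STORE" "$CONTRIB" /tmp/foo-1.0
printf 'demo settings\n' > /tmp/foo-1.0/settings
tar -czf "$STORE/foo-1.0.tar.gz" -C /tmp foo-1.0

# The actual step: unpack anything not yet present, then fix permissions.
for tarball in "$STORE"/*.tar.gz; do
  name=$(basename "$tarball" .tar.gz)
  [ -d "$CONTRIB/$name" ] && continue   # already unpacked
  tar -xzf "$tarball" -C "$CONTRIB"
  chmod -R a+rX "$CONTRIB/$name"
done
```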