[Koha-devel] A Discussion on A Policy Setting Forth Standards of Code Submission, etc. [WAS: RFCs for 3.4 from BibLibre (serials & acquisitions)]

Joe Atzberger ohiocore at gmail.com
Tue Nov 9 20:39:12 CET 2010


> > Regarding the simplicity of signing off, I take some issue.  It is
> > *severely* non-trivial to test major integration features.  Consider SIP2
> > and LDAP, or something like EDI.  It can depend not just on accurate test
> > data, but entire servers, network environments, remote accounts granted
> > by a supplier, foreign domain/language knowledge, etc.  Sure, I'd love it
> > for everybody to have a dozen signoffs.  I just think blocking code while
> > waiting on a 3rd party (who by design is disinterested) to come around
> > and dedicate some resources is a questionable policy.
>
> I'm sure there are any number of features which could be tested in
> very complex environments, and perhaps even more complex than those
> used or anticipated by the original developers themselves. I wonder if
> the original supplier of the SIP2 and LDAP features actually went to
> the level of testing you describe prior to committing those features.
>

Since that was me in both cases, I can tell you: yes for SIP2, not so much
for LDAP.  Initially LDAP was written and tested against OpenLDAP, without
any access to an Active Directory server, and it was a bitch.  (OpenLDAP has
this horrible behavior of crashing the daemon completely if you give it a
malformed command-line query or make a failed attempt to insert or edit a
record.  It also simultaneously corrupts the data stored in its compiled
B-trees.  Apparently performance was key, not reliability.)  Only later did I
test between VMs using Sun's OpenDirectory and remotely against Active
Directory.
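
To give a flavor of what even a basic cross-server check looks like, here is
a minimal sketch of a bind test against both kinds of server, written in
Python with the ldap3 library; the hostnames, DNs and credentials are made-up
placeholders.  It only illustrates the sort of environments you need to have
on hand -- it is not the code Koha itself runs.

    # Sketch: verify that the same test account can bind against both an
    # OpenLDAP and an Active Directory server.  All hosts, DNs and
    # passwords below are hypothetical placeholders.
    from ldap3 import Server, Connection, SIMPLE

    TARGETS = {
        "openldap": ("ldap://openldap.example.org",
                     "uid=testuser,ou=people,dc=example,dc=org"),
        # AD also accepts UPN-style bind names (user@domain)
        "active_directory": ("ldap://ad.example.org", "testuser@example.org"),
    }

    for name, (url, bind_dn) in TARGETS.items():
        conn = Connection(Server(url), user=bind_dn, password="secret",
                          authentication=SIMPLE)
        ok = conn.bind()   # returns False (not an exception) on bad credentials
        print(name, "bind ok" if ok else conn.result)
        conn.unbind()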

SIP2 testing was fairly robust, but required an extreme amount of data
tuning to make it possible.  I.e., if a test requires that a patron have a
$22 fine and an overdue item, you have to set up a patron with a $22 fine and
an overdue item.  And then you cannot run the overdue fines job again.  Ever.
So basically that requires a dedicated instance.
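
For the curious, one of those checks boils down to a raw Patron Information
exchange over the SIP2 socket.  Here is a rough Python sketch: the host,
port, institution and patron barcode are placeholders, error detection (the
AY/AZ checksum fields) is left out, and the exact formatting of the BV fee
field can vary by server, so treat it as an illustration of the test shape
rather than a ready-made test.

    # Sketch of a SIP2 Patron Information (63/64) round trip used to check
    # that the canned patron really carries the expected $22 fine.
    import socket
    from datetime import datetime

    def patron_information(host, port, institution, patron_id):
        # 63 request: language (3) + transaction date (18) + summary (10),
        # followed by AO/AA fields; checksum fields omitted for brevity.
        now = datetime.now().strftime("%Y%m%d    %H%M%S")
        msg = "63" + "001" + now + " " * 10 + \
              "AO" + institution + "|AA" + patron_id + "|\r"
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(msg.encode("ascii"))
            return sock.recv(4096).decode("ascii")  # one recv is enough here

    resp = patron_information("sip.example.org", 6001, "MAIN", "PATRON001")
    assert resp.startswith("64"), "expected a Patron Information response"
    # BV is the optional fee amount field in the 64 response.
    fee = next((f[2:] for f in resp.split("|") if f.startswith("BV")), None)
    assert fee == "22.00", "expected the canned $22 fine, got %r" % fee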


> > Forgive me if I'm off the pulse a bit, but do these expectations exist
> > today?  The release process establishes when new features are accepted or
> > not, and it has been pretty explicit and clear.  The problem used to be
> > big unilateral changes that weren't getting submitted (including some
> > code that I wrote).  Now the problem is they're getting submitted?
> >
>
> Perhaps this point bears greater clarification. By "expect it to be pushed"
> I meant particularly without any prior discussion/RFC/community
> participation. What was/is going on with LibLime/PTFS LEK is a classic
> example of the sort of thing which needs to be discouraged. I refer
> particularly to the process. PTFS has stated in a number of forums that once
> their "contract" obligations are met, they will submit this code to the
> community. However, the job is done at that point and clients will have
> implemented the product. There will be a certain level of "expectation" that
> "new" features "should" be pushed to the main code base.


On the plus side, this expectation is what should drive the vendor to submit
the code in its most-likely-to-be-accepted form.  In reality, the code that
gets kept in the closet for a year or more *will* lose out.  A vendor or
their client may feel like they get a competitive advantage by deferring
submission, but this is illusory.  It just backloads and complicates the
otherwise critical work of getting patches into mainline.  While the patches
sit around going stale, other people (assuming the feature is at all
desirable) are working on similar or competing versions, and then extending,
bugfixing, and documenting the version that actually got published.  The work you do to extend
or revise the withheld feature is quite possibly wasted effort.  Resolving
the competing implementations is often more work than it was to write either
one of them, and could have been avoided entirely with earlier publication.

Getting your patches in master is a *defensive* position that establishes
your data model, API and presentation as accepted.  But we can preach all
day on this.  Let's try not to.



> In spite of this, I hope we can avoid becoming vendor-centric.
> "Vendor-centricity" leads to vendor dominance and control ultimately.

I am not personally aware of a vendor controlled FOSS project that
> does not lean heavily in that vendor's favor.


I think you conflate the fact that vendors are primary players with the
outcome of single-vendor control.  A healthy FOSS project has many players;
whether they come from different commercial interests or different user
bases doesn't really matter.


> The problem is not one of where to "relegate" vendors in the "social"
> structure of the community. Rather it is all about keeping vendors who
> have relatively limitless resources from holding the controlling
> interest in the community ...
>

Your impression of a vendor's available resources is a bit fantastical.
Having worked at both LibLime and Equinox, I can tell you neither company
had more than 40 employees, with only a minority in development.  If LibLime
had such boundless resources, they would not have had to sell to PTFS.  And
PTFS, in terms of resources relevant to Koha, is not applying a team even
half that size.  Compared to clients like King County (WA), WALDO or GPLS,
the operational capacity and physical resources of vendors (even PTFS) are
scant.


> >> This may not be the view of all involved. However, if it were not for
> >> Koha, Koha support vendors would be out of some amount of business.
> >
> > And without vendors *no* version of Koha would ever have been written or
> > released.  It's sorta funny that I'm the one saying this stuff, as I am
> > not currently affiliated with any vendor.
>
> I think that 1.0 was written by coders for hire (aka Katipo) and was
> released by, not a vendor, but HLT, a library. So, in fact, no vendor
> in the technical understanding was involved. Chris C. can correct me
> if I'm wrong here.
>

Katipo was and is a vendor.  They were not users of the software themselves;
instead they sold software development and services to HLT.  That is the
technical and practical definition of a vendor, and it is exactly what other
Koha development shops do, except that Katipo was starting from scratch
instead of from an existing version.  I'm not sure what additional
indicators of vendorness you might be looking for.  Perhaps an evil
moustache?  : })

--Joe