From Mike Luther@1:117/3001 to Jonathan de Boyne Pollard on Sun Aug 26 03:11:14 2001
I think, Jonathan, from what I've learned so far, that a part of what you've offered is correct.
JdBP> 1. Have some utility run on the remote end that creates a file
JdBP> containing the set of (binary) differences between
JdBP> two successive versions of the database file. Have
JdBP> a utility on the local end that takes such a
JdBP> "differences" file and a database and applies the
JdBP> former to the latter. Compress just the differences
JdBP> file for downloading. This is an analogue of the
JdBP> approach used by many large "open source" software
JdBP> projects, where instead of compressed snapshots of
JdBP> the entire source tree one can download compressed
JdBP> differences between one version and the next.
There appear to be two basic granularities to the project. One relates to perhaps hundreds of thousands of 'things' that are common across the enterprise. Even at that, what is suggested above is, I think, workable with modern equipment.
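The "differences file" scheme in point 1 could be sketched very roughly as below. This is a toy fixed-size-block comparison, not any real binary diff tool; an actual utility (which would match moved data, not just changed blocks in place) would produce far smaller patch files:

```python
# Toy sketch of a "differences file": record only the fixed-size blocks
# of the new version that differ from the old one, then replay them.
BLOCK = 4096

def make_diff(old: bytes, new: bytes) -> list:
    """List of (offset, changed block) pairs, plus a truncate marker."""
    diff = []
    for i in range(0, len(new), BLOCK):
        chunk = new[i:i + BLOCK]
        if old[i:i + BLOCK] != chunk:
            diff.append((i, chunk))
    if len(new) < len(old):
        diff.append(("truncate", len(new)))   # new version shrank
    return diff

def apply_diff(old: bytes, diff: list) -> bytes:
    """Rebuild the new version from the old file plus the diff."""
    data = bytearray(old)
    for offset, chunk in diff:
        if offset == "truncate":
            del data[chunk:]
        else:
            data[offset:offset + len(chunk)] = chunk
    return bytes(data)
```

Only the `diff` list would then need to be compressed and downloaded, rather than the whole database file.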
JdBP> 2. Increase the granularity of the compressed bundle. Rather than
JdBP> having BUNDLE.ZIP containing A.TXT, B.TXT, and
JdBP> C.TXT, have A.ZIP, B.ZIP, and C.ZIP each containing
JdBP> a single file. Download each individual archive
JdBP> file if its datestamp indicates that it has been
JdBP> updated. This approach presumes that, as your
JdBP> original message implied, your database is a set of
JdBP> multiple individual files rather than a single
JdBP> humungous BUNDLE.DAT file with a complex internal
JdBP> structure.
And the above is exactly what is needed, simply for storage needs.
I'm thinking carefully on how it may be possible to implement the above.
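Point 2 could be scripted along these lines. Here `remote_mtime()` and `fetch_file()` are hypothetical stand-ins for whatever the mailer or transfer setup actually provides; the point is just the datestamp check before each per-file download:

```python
# Sketch of the per-file granularity scheme: one archive per data file
# (A.ZIP, B.ZIP, ...), fetched only when the remote copy is newer.
import os
import zipfile

def update_archive(name, local_dir, remote_mtime, fetch_file):
    """Download and unpack one single-file archive if it is stale."""
    local = os.path.join(local_dir, name)
    if os.path.exists(local) and os.path.getmtime(local) >= remote_mtime(name):
        return False                  # local copy is current; skip it
    fetch_file(name, local)           # transfer mechanism is assumed
    with zipfile.ZipFile(local) as z:
        z.extractall(local_dir)       # unpack the single file inside
    return True
```

Run over the whole set of archive names, this downloads only what changed since the last poll, which is the storage and bandwidth win described above.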
The other issue I face is, I think, more complicated.
A given box generates perhaps a hundred transactions a day. The field of action in the market spans, even at present, over a million and a half boxes. Thus the transaction-processing load that has to be handled and stored is, realistically, over a million transaction events a
day. Each of them has to be recoverable, and its disposition at Empire Central worked out, at least ONCE at the originating embedded boxlette end. From that point on, any one of them actually has to be recoverable, on demand, for 50 years or longer in some cases we already know about.
Mike @ 1:117/3001
--- Maximus/2 3.01
* Origin: Ziplog Public Port (1:117/3001)
From Francois Thunus@1:1/0 to Mike Luther on Wed Aug 29 00:53:01 2001
19 Aug 01 08:10, Mike Luther wrote to Jonathan de Boyne Pollard:
Collaborative data management was one of the high points of what Lotus
was intended to be all about. However, when you postulate Lotus for thousands and thousands of boxes, the cost becomes a serious concern.
There is something that does what I think you are looking for, and for free: rsync. In my company we use Linux internally, and we use rsync for all updates across the network. I have no idea whether this is an option for you, but you can always download the source code for rsync and try to recompile it for your platform of choice (considering the echo, I'd say OS/2, in which case you have a fair chance with emx).
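For anyone curious why rsync transfers so little: its delta algorithm rests on a weak "rolling" checksum that can be slid one byte forward in constant time, so the receiver's blocks can be matched anywhere in the sender's file. A toy illustration of just that rolling property (not rsync's actual code):

```python
# Toy rolling (weak) checksum in the style rsync uses: sliding the
# window one byte costs O(1) instead of rescanning the whole block.
M = 1 << 16

def weak_sum(block: bytes):
    """Checksum pair (a, b) of a block, computed from scratch."""
    a = sum(block) % M
    b = sum((len(block) - i) * x for i, x in enumerate(block)) % M
    return a, b

def roll(a, b, out_byte, in_byte, blocklen):
    """Slide the window one byte: drop out_byte, take in in_byte."""
    a = (a - out_byte + in_byte) % M
    b = (b - blocklen * out_byte + a) % M
    return a, b
```

Because the rolled result equals a fresh `weak_sum()` of the shifted window, a sender can cheaply test every byte offset against the receiver's block checksums and send only the unmatched data.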