Frederik Ramm, Tue, 27 Feb 2001 :
> > When scanning the first page, half the data goes in this large buffer,
> > then when the frontend asks for another page it comes out of the
> > buffer very quickly. The buffer can be discarded when the frontend
> > calls ..._close() so it won't take up RAM all the time.
> That sounds reasonable. But can I be sure that front-ends will support
> reading multiple pages one after the other with no call to ..._close()
> > in between? This would not, for example, work with "scanimage", would it?
You can't be sure, but you can be quite sure that at least the graphical
frontends won't call sane_close() between scans.
On the whole, it's just an insufficiency in the current SANE Standard.
Probably a simple

    SANE_Bool another_scan_follows;

in SANE_Parameters would be sufficient. If another_scan_follows is true
and last_frame is true, the frontend knows that the next frame belongs to a
new image. This, of course, will only fly if the backend stores the
following frames in RAM, as suggested by Nick Lamb.
Another advantage is that this doesn't break compatibility with
current frontends. They just read one image and call sane_cancel(); the
backend then knows that it can discard the following frames.
--
Source code, list archive, and docs: http://www.mostang.com/sane/
To unsubscribe: echo unsubscribe sane-devel | mail firstname.lastname@example.org
This archive was generated by hypermail 2b29 : Tue Feb 27 2001 - 11:34:11 PST