The Crawler-Lib Engine is a general-purpose, workflow-enabled task processor. It evolved from a web crawler through data mining and information retrieval work. It is optimized for throughput and can perform thousands of tasks per second on standard hardware. Its workflow capabilities make it possible to structure and parallelize even complex kinds of work. Please visit the project page for a complete overview of the Crawler-Lib Engine.
A license for the Anonymous Edition is included in the package. A license for the more powerful free Community Edition can be generated on the project page.
See the version list below for details.
* Added additional constructors to several workflow elements, so they can be constructed and used without specifying a complete configuration object for the element.
* Added an AwaitProcessingEnum awaitProcessing parameter to several workflow element constructors, so you can specify that the continuation is called even on failure and then check the Success property to decide what to do.
* Workflow elements are awaitable as of this release.
* Added new workflow elements for limits and operation cost calculation.
* Many small extensions and refactorings.
Version 2.00 - 2.02:
* First public releases.