WebCab Optimization for .NET

Product Details

This suite includes the following features:


  • Local unidimensional optimization - finds local minima / maxima of continuous functions of one variable

    • Fast 'low level' algorithms - use these algorithms when your primary concern is speed rather than accuracy. You will have to choose one bracketing algorithm and one locate algorithm (note that they are only useful in pairs), and you will have to supply a number of parameters by hand (tolerance, maximum cycles, etc.) which can dramatically affect performance. A minimal sketch of this bracket-then-locate workflow follows this subsection.

      • Bracketing algorithms - these methods find an interval containing at least one extremum of a continuous function

        • Acceleration bracketing - this method can be used with any continuous function
        • Parabolic extrapolation bracketing - gives better results than acceleration bracketing for a large class of functions (those that are locally parabolic about the extremum)
        • Acceleration bracketing for differentiable functions - requires the derivative to be known; slower than the general acceleration algorithm but also safer

      • Locate algorithms - these methods converge to an extremum once it has been bracketed, provided the function under consideration is continuous

        • Parabolic interpolation locate - a very fast algorithm with moderate accuracy
        • Linear locate - a slow algorithm, but with stable convergence
        • Brent locate - medium speed with good accuracy; this balance makes it very efficient in practice
        • Cubic interpolation locate - very fast with reasonable accuracy; requires the derivative to be known
        • Brent method for differentiable functions - medium speed and good accuracy, but requires the derivative to be known


    • Accurate 'high level' algorithms - these algorithms are easy to use and offer high accuracy, but they are very slow compared with the 'low level' algorithms above (1,000 to 10,000 times slower). Use them when you need reliable results; a 'high level' algorithm is far less likely to return an incorrect result than a 'low level' one.

      • Method for continuous functions
      • Method for differentiable functions

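As a concrete illustration of the bracket-then-locate workflow above, here is a minimal generic C# sketch (not the WebCab API): an acceleration-style bracketing step followed by a golden-section locate, a simple, stable 'linear'-style locate. The test function, initial step and tolerance are arbitrary illustrative choices.

```csharp
using System;

static class UnidimensionalDemo
{
    // Acceleration-style bracketing: walk downhill with a geometrically growing
    // step until a triple (a, b, c) with f(b) < f(a) and f(b) < f(c) is found.
    public static (double a, double b, double c) Bracket(Func<double, double> f, double a, double step)
    {
        double b = a + step;
        if (f(b) > f(a)) { (a, b) = (b, a); step = -step; }   // make sure we walk downhill
        double c = b + step;
        while (f(c) < f(b))
        {
            step *= 2.0;                                      // accelerate the step
            a = b; b = c; c = b + step;
        }
        return a < c ? (a, b, c) : (c, b, a);
    }

    // Golden-section locate: slow but very stable convergence inside a bracket.
    public static double GoldenSection(Func<double, double> f, double a, double c, double tol)
    {
        const double r = 0.6180339887498949;                  // golden ratio minus one
        double x1 = c - r * (c - a), x2 = a + r * (c - a);
        while (Math.Abs(c - a) > tol)
        {
            if (f(x1) < f(x2)) { c = x2; x2 = x1; x1 = c - r * (c - a); }
            else               { a = x1; x1 = x2; x2 = a + r * (c - a); }
        }
        return 0.5 * (a + c);
    }

    static void Main()
    {
        Func<double, double> f = x => (x - 2.0) * (x - 2.0) + 1.0;  // minimum at x = 2
        var (a, _, c) = Bracket(f, 0.0, 0.1);
        Console.WriteLine(GoldenSection(f, a, c, 1e-8));            // prints approximately 2
    }
}
```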

  • Global unidimensional optimization - finds global minima / maxima.

    • Methods for continuous functions
    • Methods for differentiable functions

  • Unconstrained local multidimensional optimization

    • Methods for general functions - these algorithms do not require the function to be continuous

      • The downhill simplex method of Nelder and Mead - minimizes the function by evolving a simplex through reflection, expansion and contraction moves (a sketch follows this subsection)

    • Methods for continuous functions - these algorithms require the function to be continuous

      • Conjugate direction algorithms - these algorithms search by iterating along mutually conjugate directions

        • Powell's method - an implementation of the conjugate direction algorithm


    • Methods for differentiable functions - these algorithms require the gradient of the function to be known

      • Steepest descent - a classical method with generally poor performance; it should mainly be used for testing purposes
      • Conjugate gradient algorithms - speed and accuracy depend strongly on the particular function, and these methods can be deceived by 'valleys' in the N-dimensional space (a sketch follows this subsection)

        • Fletcher-Reeves - an implementation of the conjugate gradient method
        • Polak-Ribière - an implementation of the conjugate gradient method

      • Variable metric (quasi-Newton) algorithms - slower, but give good results on a large class of continuous functions; the basic idea is to build a sequence of matrices that converges to the inverse Hessian of the function

        • Fletcher-Powell - an implementation of the variable metric algorithm
        • Broyden-Fletcher-Goldfarb-Shanno - an implementation of the variable metric algorithm

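The following is a minimal generic C# sketch of the downhill simplex idea (not the WebCab API, and simplified: only reflection, expansion, inside contraction and shrinking, with a fixed iteration budget). The test function, initial step and iteration count are illustrative.

```csharp
using System;
using System.Linq;

static class NelderMeadDemo
{
    // Downhill simplex (Nelder-Mead): evolves a simplex of n+1 points by
    // reflection, expansion, contraction and shrinking; needs no derivatives.
    public static double[] Minimize(Func<double[], double> f, double[] x0,
                                    double step = 0.5, int iters = 2000)
    {
        int n = x0.Length;
        var pts = new double[n + 1][];
        var fv = new double[n + 1];
        for (int i = 0; i <= n; i++)
        {
            pts[i] = (double[])x0.Clone();
            if (i > 0) pts[i][i - 1] += step;                  // initial simplex around x0
            fv[i] = f(pts[i]);
        }

        for (int it = 0; it < iters; it++)
        {
            // Order vertices from best (index 0) to worst (index n).
            var order = Enumerable.Range(0, n + 1).OrderBy(i => fv[i]).ToArray();
            pts = order.Select(i => pts[i]).ToArray();
            fv = order.Select(i => fv[i]).ToArray();

            // Centroid of all vertices except the worst.
            var c = new double[n];
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++) c[j] += pts[i][j] / n;

            double[] Move(double coef)                         // point c + coef * (c - worst)
            {
                var p = new double[n];
                for (int j = 0; j < n; j++) p[j] = c[j] + coef * (c[j] - pts[n][j]);
                return p;
            }

            var xr = Move(1.0); double fr = f(xr);             // reflection
            if (fr < fv[0])
            {
                var xe = Move(2.0); double fe = f(xe);         // expansion
                if (fe < fr) { pts[n] = xe; fv[n] = fe; } else { pts[n] = xr; fv[n] = fr; }
            }
            else if (fr < fv[n - 1]) { pts[n] = xr; fv[n] = fr; }
            else
            {
                var xc = Move(-0.5); double fc = f(xc);        // inside contraction
                if (fc < fv[n]) { pts[n] = xc; fv[n] = fc; }
                else
                {
                    for (int i = 1; i <= n; i++)               // shrink toward the best vertex
                    {
                        for (int j = 0; j < n; j++) pts[i][j] = pts[0][j] + 0.5 * (pts[i][j] - pts[0][j]);
                        fv[i] = f(pts[i]);
                    }
                }
            }
        }

        int bi = 0;                                            // return the best vertex found
        for (int i = 1; i <= n; i++) if (fv[i] < fv[bi]) bi = i;
        return pts[bi];
    }

    static void Main()
    {
        // Rosenbrock's function: minimum at (1, 1).
        Func<double[], double> f = v => Math.Pow(1 - v[0], 2) + 100 * Math.Pow(v[1] - v[0] * v[0], 2);
        var x = Minimize(f, new[] { -1.2, 1.0 });
        Console.WriteLine($"{x[0]:F4}, {x[1]:F4}");            // should converge to about (1, 1)
    }
}
```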

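For the gradient-based family, here is a minimal generic C# sketch of a Polak-Ribière conjugate gradient method with a simple backtracking line search (again not the WebCab API; the restart rule, line-search constants and test function are illustrative choices).

```csharp
using System;

static class ConjugateGradientDemo
{
    // Polak-Ribière nonlinear conjugate gradient with a backtracking line search.
    public static double[] Minimize(
        Func<double[], double> f, Func<double[], double[]> grad,
        double[] x0, int maxIter = 200, double tol = 1e-8)
    {
        var x = (double[])x0.Clone();
        var g = grad(x);
        var d = new double[x.Length];
        for (int i = 0; i < x.Length; i++) d[i] = -g[i];       // start with steepest descent

        for (int iter = 0; iter < maxIter; iter++)
        {
            if (Norm(g) < tol) break;

            // If d is not a descent direction (possible with nonlinear CG), restart.
            double slope = Dot(g, d);
            if (slope >= 0) { for (int i = 0; i < d.Length; i++) d[i] = -g[i]; slope = Dot(g, d); }

            // Backtracking (Armijo) line search along d.
            double t = 1.0, fx = f(x);
            while (t > 1e-12 && f(Add(x, d, t)) > fx + 1e-4 * t * slope) t *= 0.5;

            var xNew = Add(x, d, t);
            var gNew = grad(xNew);

            // Polak-Ribière beta, clamped at zero (automatic restart).
            double beta = Math.Max(0.0, (Dot(gNew, gNew) - Dot(gNew, g)) / Dot(g, g));
            for (int i = 0; i < d.Length; i++) d[i] = -gNew[i] + beta * d[i];

            x = xNew; g = gNew;
        }
        return x;
    }

    static double Dot(double[] a, double[] b) { double s = 0; for (int i = 0; i < a.Length; i++) s += a[i] * b[i]; return s; }
    static double Norm(double[] a) => Math.Sqrt(Dot(a, a));
    static double[] Add(double[] x, double[] d, double t) { var r = new double[x.Length]; for (int i = 0; i < x.Length; i++) r[i] = x[i] + t * d[i]; return r; }

    static void Main()
    {
        // Rosenbrock's banana-shaped valley: minimum at (1, 1).
        Func<double[], double> f = v => Math.Pow(1 - v[0], 2) + 100 * Math.Pow(v[1] - v[0] * v[0], 2);
        Func<double[], double[]> grad = v => new[]
        {
            -2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0] * v[0]),
            200 * (v[1] - v[0] * v[0])
        };
        var x = Minimize(f, grad, new[] { -1.2, 1.0 }, maxIter: 5000);
        Console.WriteLine($"{x[0]:F4}, {x[1]:F4}");            // should converge to about (1, 1)
    }
}
```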

  • Unconstrained global multidimensional optimization

    • Simulated annealing - a technique well suited to large-scale problems, especially those in which the desired global extremum is hidden among many poorer local extrema (a sketch follows this item)

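Here is a minimal generic C# sketch of simulated annealing (not the WebCab API); the proposal rule, cooling schedule and test function are arbitrary choices, and because the method is stochastic the result varies from run to run.

```csharp
using System;

static class SimulatedAnnealingDemo
{
    // Simulated annealing for a function with many local minima.
    public static double[] Anneal(Func<double[], double> f, double[] x0,
                                  double t0 = 10.0, double cooling = 0.999, int steps = 20000, int seed = 0)
    {
        var rng = new Random(seed);
        var x = (double[])x0.Clone();
        var best = (double[])x.Clone();
        double fx = f(x), fBest = fx, temp = t0;

        for (int k = 0; k < steps; k++)
        {
            // Propose a random neighbour; the step size shrinks with temperature.
            var y = (double[])x.Clone();
            int i = rng.Next(y.Length);
            y[i] += (rng.NextDouble() - 0.5) * Math.Sqrt(temp);

            double fy = f(y);
            // Always accept improvements; accept uphill moves with Boltzmann probability.
            if (fy < fx || rng.NextDouble() < Math.Exp((fx - fy) / temp))
            {
                x = y; fx = fy;
                if (fx < fBest) { best = (double[])x.Clone(); fBest = fx; }
            }
            temp *= cooling;                                   // geometric cooling schedule
        }
        return best;
    }

    static void Main()
    {
        // Rastrigin's function: global minimum 0 at the origin, many local minima.
        Func<double[], double> rastrigin = v =>
        {
            double s = 10 * v.Length;
            foreach (var xi in v) s += xi * xi - 10 * Math.Cos(2 * Math.PI * xi);
            return s;
        };
        var x = Anneal(rastrigin, new[] { 3.7, -2.2 });
        Console.WriteLine($"{x[0]:F3}, {x[1]:F3}");            // typically lands near the origin; results vary
    }
}
```
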
  • Constrained optimization for differentiable functions with linear constraints

    • Rosen's gradient projection algorithm - uses the Kuhn-Tucker conditions as a termination criterion (a simplified sketch of the projection idea follows this item)

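The C# sketch below only illustrates the projection idea behind such methods, for the special case of a single linear equality constraint with a fixed step size; it is not Rosen's full active-set algorithm and not the WebCab API, and the objective, constraint and starting point are illustrative.

```csharp
using System;

static class GradientProjectionDemo
{
    // Projected gradient descent for: minimize f(x) subject to a·x = b
    // (one linear equality constraint). The gradient is projected onto the
    // constraint's null space at each step, so iterates stay feasible.
    public static double[] Minimize(Func<double[], double[]> grad, double[] a, double b,
                                    double[] x0, double step = 0.1, int iters = 1000)
    {
        var x = ProjectOntoPlane(x0, a, b);                    // start from a feasible point
        for (int k = 0; k < iters; k++)
        {
            var g = grad(x);
            var pg = ProjectOntoNullSpace(g, a);               // drop the component normal to the plane
            for (int i = 0; i < x.Length; i++) x[i] -= step * pg[i];
        }
        return x;
    }

    static double Dot(double[] u, double[] v) { double s = 0; for (int i = 0; i < u.Length; i++) s += u[i] * v[i]; return s; }

    static double[] ProjectOntoNullSpace(double[] g, double[] a)
    {
        double c = Dot(g, a) / Dot(a, a);
        var r = new double[g.Length];
        for (int i = 0; i < g.Length; i++) r[i] = g[i] - c * a[i];
        return r;
    }

    static double[] ProjectOntoPlane(double[] x, double[] a, double b)
    {
        double c = (Dot(x, a) - b) / Dot(a, a);
        var r = new double[x.Length];
        for (int i = 0; i < x.Length; i++) r[i] = x[i] - c * a[i];
        return r;
    }

    static void Main()
    {
        // Minimize x1^2 + x2^2 subject to x1 + x2 = 1; the optimum is (0.5, 0.5).
        Func<double[], double[]> grad = v => new[] { 2 * v[0], 2 * v[1] };
        var x = Minimize(grad, new[] { 1.0, 1.0 }, 1.0, new[] { 3.0, -1.0 });
        Console.WriteLine($"{x[0]:F3}, {x[1]:F3}");            // 0.500, 0.500
    }
}
```
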
  • Linear programming - both the objective function and the constraints are linear

    • The simplex algorithm - the Kuenzi, Tzschach and Zehnder implementation of the simplex algorithm for linear programming (a generic tableau sketch follows this list)

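To make the linear programming entry concrete, here is a generic dense-tableau simplex sketch in C# for problems of the form maximize c·x subject to A x <= b, x >= 0 with b >= 0 (so the slack variables form an initial feasible basis). It is a textbook illustration, not the Kuenzi, Tzschach and Zehnder implementation shipped with the product.

```csharp
using System;

static class SimplexDemo
{
    // Dense tableau simplex for:  maximize c·x  subject to  A x <= b,  x >= 0,
    // assuming b >= 0 so the slack variables give an initial feasible basis.
    public static double[] Maximize(double[,] A, double[] b, double[] c)
    {
        int m = b.Length, n = c.Length;
        var T = new double[m + 1, n + m + 1];
        var basis = new int[m];

        for (int i = 0; i < m; i++)
        {
            for (int j = 0; j < n; j++) T[i, j] = A[i, j];
            T[i, n + i] = 1.0;                                 // slack variable
            T[i, n + m] = b[i];                                 // right-hand side
            basis[i] = n + i;
        }
        for (int j = 0; j < n; j++) T[m, j] = -c[j];            // objective row (maximization)

        while (true)
        {
            // Entering variable: most negative reduced cost.
            int piv = -1; double best = -1e-9;
            for (int j = 0; j < n + m; j++)
                if (T[m, j] < best) { best = T[m, j]; piv = j; }
            if (piv < 0) break;                                 // optimal

            // Leaving variable: minimum ratio test.
            int row = -1; double bestRatio = double.PositiveInfinity;
            for (int i = 0; i < m; i++)
                if (T[i, piv] > 1e-9 && T[i, n + m] / T[i, piv] < bestRatio)
                { bestRatio = T[i, n + m] / T[i, piv]; row = i; }
            if (row < 0) throw new InvalidOperationException("Problem is unbounded.");

            // Pivot on (row, piv).
            double p = T[row, piv];
            for (int j = 0; j <= n + m; j++) T[row, j] /= p;
            for (int i = 0; i <= m; i++)
            {
                if (i == row) continue;
                double factor = T[i, piv];
                for (int j = 0; j <= n + m; j++) T[i, j] -= factor * T[row, j];
            }
            basis[row] = piv;
        }

        var x = new double[n];
        for (int i = 0; i < m; i++)
            if (basis[i] < n) x[basis[i]] = T[i, n + m];
        return x;
    }

    static void Main()
    {
        // maximize 3x + 5y  subject to  x <= 4,  2y <= 12,  3x + 2y <= 18
        var A = new double[,] { { 1, 0 }, { 0, 2 }, { 3, 2 } };
        var x = Maximize(A, new[] { 4.0, 12.0, 18.0 }, new[] { 3.0, 5.0 });
        Console.WriteLine($"x = {x[0]}, y = {x[1]}");           // x = 2, y = 6
    }
}
```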



WebCab Optimization for .NET is delivered electronically.

WebCab Optimization for .NET pricing:
  • 1 Developer License - $179
  • 4 Developer License - $299
  • 1 Site License - $599

