Loading a full installation of TNTmips 5.5 onto your hard drive (exclusive of any other products, data sets, illustrations, Word files, etc.) requires the following storage space in megabytes.
V5.50 of the illustrations for the on-line documentation requires an additional 26 megabytes. Installing all the sample geodata sets for TNTlite and TNTmips requires an additional 61 megabytes. V5.50 of the TNT products for the DEC Ultrix, IBM PowerRISC RS/6000, and Data General Aviion platforms is available upon special request; a special CD will be produced for each such order.

If you did not order an upgrade of your TNT professional product and wish to do so now, please contact MicroImages by FAX, phone, or email to arrange to purchase your quarterly upgrade to V5.50. Upon receipt and processing of your order, MicroImages will supply you with an authorization code by return FAX only. Entering this code when running the installation process allows you to complete the installation and immediately start to use TNTmips 5.50 and the other TNT professional products.

If you do not have an annual subscription to TNTmips, you can purchase V5.50 under the elective upgrade plan at the cost in the tables below. Please remember that new features have been added to TNTmips each quarter. Thus, the more quarters you are behind V5.50, the higher your upgrade cost, up to a fixed limit. Upgrades from all previous versions of MIPS and from TNTmips 5.0 or earlier carry the same fixed cost shown below. As usual, there is no additional charge for the upgrade of any special peripheral support features, TNTlink, or TNTsdk which you may have added to your basic TNTmips system.

Within the NAFTA point-of-use area (Canada, U.S., and Mexico):
For a point-of-use in all other nations:
TNTview® 5.5

The following is a summary of the new features added to the TNT products which are now available in TNTview. Detailed descriptions of these new features can be found in the appropriate section below on TNTmips.
Within the NAFTA point-of-use area (Canada, U.S., and Mexico):
For a point-of-use in all other nations:
TNTatlas™ 5.5

The navigator window has been updated. It is now smaller and simpler. The attached color plate entitled HyperIndex Navigator Update illustrates these improvements. Detailed descriptions of the new features added to TNTatlas can be found in the appropriate section below on TNTmips.

Snapshot color printing using Windows printer drivers was provided in V5.40. The quality of these color prints has been improved. P3 and P5 printer support is now included without cost in TNTmips, TNTview, and TNTlite. Thus, complex layout and color printing at 8.5 by 11 inches is available via the Map and Poster Layout process in these products.

Within the NAFTA point-of-use area (Canada, U.S., and Mexico):
For a point-of-use in all other nations:
TNTatlas™ sampler of San Francisco

Prototype3 of the sample San Francisco Bay Area TNTatlas is finally enclosed. New features available in this version relative to those in Prototype2 are as follows.
This sample TNTatlas is no longer free. Once the information that something is "FREE" begins to circulate on the Internet, it is impossible to sort out truly interested parties from tourists. Earlier this year, information about this FREE CD was posted on the Internet on a FREE CD list. As a result, over 300 requests from tourists were received in a two-day period before the posting could be removed. Additional Prototype3 CDs can be purchased for use or distribution on the same price schedule as TNTlite kits. Prototype3 will not be included within the detailed promotional packages distributed by MicroImages to prospective clients.

TNTlite™ 5.5

Approximately 5000 TNTlite kits have been distributed. A copy of the standard V5.50 "A" or "B" CD will be shipped to each party who has registered their copy of the V5.40 CD. These copies will be shipped when all shipments to professional clients are completed. A sampling of the testimonial letters at the end of this MEMO indicates the kind of feedback being received. The following is a summary of the new features added to the TNT products to improve their usability in general and for TNTlite in particular. Detailed descriptions of these new features can be found in the appropriate section on TNTmips.
The documentation has expanded this quarter to a total of 2546 single-spaced pages. Last-minute supplemental sections, which do not appear in the on-line documentation or Microsoft Word versions, were created for new processes and features. These sections were completed for V5.50 after the master CDs were created for the reproduction process. These 169 additional pages are included in supplemental, printed form as follows.
New TNTmips Application Features

* Paragraphs or main sections preceded by this symbol "*" introduce significant new processes, or features in existing processes, released for the first time in TNTmips 5.5.

> Paragraphs or main sections preceded by this symbol ">" introduce modifications in the TNT professional products which have additional, special significance to the users of the TNTlite products.

Multi-user Improvements. The following section concerns only those TNT installations where multiple clients, locally or remotely via a network, share a TNT product. Other than the transparent alterations in the *.INI reference file structure, these changes to improve the support of multi-user installations do not affect single-computer, single-client installations of the TNT products.

It is now possible to use MicroImages' public domain MI/X server on any Mac, PMac, W95, or NT based microcomputer as an X terminal to connect to, and remotely use, any TNT product (or other X based product such as ARC/INFO) operating on an NT3.51, NT4, or UNIX platform. For example, the MI/X server can be used to remotely access and operate a multi-user ARC/INFO system (ESRI does not provide an X server for this purpose). To support this, each X client on a multi-user system can now have a private preferences file for the TNT products. This file is stored in the X client machine's home directory and is available for both local and remote use.

IMPORTANT: MicroImages' clients already have all the MI/X servers on the dual set of CDs distributed each quarter for all platforms. Anyone else worldwide who wishes can use MicroImages' MI/X servers without charge. These free X servers can be installed on any Windows or Mac platform to access and control ARC/INFO and other such UNIX products. To acquire these X servers, simply contact www.microimages.com, find the "FREE STUFF" page, execute the simple download procedure available, and subsequently use the installation procedure copied to your drive.
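For readers unfamiliar with remote X display, the arrangement is the standard X one and can be sketched in two lines. The hostname and executable name below are hypothetical illustrations, not documented MicroImages names; only the standard X DISPLAY variable is assumed.

    # Start the MI/X server on the desktop Mac, PMac, W95, or NT machine;
    # it behaves as a standard X display (:0). Then, on the UNIX or NT host
    # where the TNT product (or ARC/INFO) runs, direct X output to that
    # desktop machine. 'mymac' and 'tntmips' are hypothetical names.
    setenv DISPLAY mymac:0.0   # csh syntax; sh: DISPLAY=mymac:0.0; export DISPLAY
    tntmips &                  # the process opens its windows on the desktop machine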
Multi-User Setup (*.INI file redesign). The functions that read and write the INI reference files used by the TNT products have been rewritten to allow storage of user default preferences like map projections, XY digitizer control points, extents, and other parameters which will be added later. The structure of the INI files has changed to accommodate separate and different user preferences on multi-user platforms like NT and UNIX. Another feature of the new INI files is that they are written in UTF8 encoding, which the TNT text editor will handle. This allows storage of UNICODE strings in an ASCII file, which can be used to encode these INI files in other languages. The files tntmips.ini, tntview.ini, and tntatlas.ini are now obsolete, but tntmips.ini is still used for backward compatibility if the new tntproc.ini, tnthost.ini, or tntserv.ini files cannot be found. The new files are as follows.

tntproc.ini (user preference file). This "proc" (process) control file contains those parameters which describe the characteristics and preferences of an individual client with access to a TNT product. The bulk of the V5.40 tntmips.ini has been moved into this new tntproc.ini file, except for those items transferred into the two new files described below. The tntproc.ini file goes into the home directory of each client on their home platform.

tnthost.ini (platform settings file). This "host" control file is placed in the same directory with the TNT executables and contains those parameters which define the total TNT system characteristics and preferences set in common for all clients. Anything that needs to be set only once, is platform specific, and deals with platform resource administration goes in this file. It contains the [KEY], [RVC], and [DIGITIZERS] system parameter definition groups. It also has three fields from the [Files] group: 'TNTpath', 'FontPath', and 'TempFile'. It is recommended that the system administrator set this file to be read-only for all clients. Under this read-only condition, any input position in TNTmips where these parameters could be changed will be grayed out, but the current settings will still be shown.

tntserv.ini (MI/X preferences file). This "serv" (X server) control file contains the parameters which record an individual client's personal characteristics and preferences for the look and feel of MI/X. It is now separated from the individual's TNT preferences since the MI/X server may be installed for use with some other UNIX software such as ARC/INFO. Most of the [XSERVER] group of parameters are in this file, as well as 'WManRCFile' from the [Files] section. This file can be placed either with the executables or, as a user preference, in the client's home directory. Only this file is needed on the local machine to define the local MI/X server's parameters when it is used with other X software such as ARC/INFO.

These INI files can also be relocated in a variety of places; for example, a local client can set up the local machine and its home directory to suit themselves, and MI/X and TNT will search the possible locations for these files in a fixed order. In summary, the following are the proper places for the new INI files. On a single machine: tntserv.ini in the home directory if available, tnthost.ini in the directory with the TNT executables, and tntproc.ini in the home directory if available. On a networked installation: tntserv.ini in each client's home directory, tnthost.ini on the server in the directory with the executables, and tntproc.ini in each client's home directory if available. An illustrative sketch of these files follows.
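To make the division of settings concrete, here is a minimal sketch of what the three files might look like. The group names [KEY], [RVC], [DIGITIZERS], [XSERVER] and the 'TNTpath', 'FontPath', 'TempFile', and 'WManRCFile' fields are named in this MEMO; all of the values shown are hypothetical illustrations, not MicroImages' actual file contents.

    ; tnthost.ini -- shared, platform-wide settings (recommended read-only)
    [KEY]
    ; license key parameters (values hypothetical)
    [RVC]
    ; Project File (RVC) system parameters (values hypothetical)
    [DIGITIZERS]
    ; digitizer definitions shared by all clients (values hypothetical)
    [Files]
    TNTpath=C:\TNT\            ; hypothetical path to the TNT executables
    FontPath=C:\TNT\FONTS\     ; hypothetical font directory
    TempFile=C:\TEMP\          ; hypothetical scratch location

    ; tntproc.ini -- per-client preferences, kept in the client's home
    ; directory (map projections, XY digitizer control points, extents, ...)

    ; tntserv.ini -- per-client MI/X look and feel, usable without any TNT product
    [XSERVER]
    ; MI/X window and display preferences (values hypothetical)
    [Files]
    WManRCFile=C:\TNT\WMAN.RC  ; hypothetical window manager resource file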
Miscellaneous.

Metadata. There is now a "Metadata..." button on the various file/object selection dialogs in the TNT products. Metadata is "data about the dataset", and about geospatial data in particular, intended to assist anyone who may acquire and attempt to use that geodata set. It can contain a wide variety of information in any text format, language, and alphabet (including 2-byte languages such as Japanese, Chinese, Korean, ...). For example, the Metadata could describe all the steps taken in the creation of the geodata and be as formal as a log file; all positional accuracy information, even relating back to field or GPS surveys; and any other kind of descriptive material added to this file as its subsequent users modify and edit it.

Metadata files usually accompany the primary geodata file(s) as separate descriptive file(s). There are currently no widely accepted standards or conventions as to the names or extensions of Metadata file(s), what they should contain, and how that material should be formatted. A U.S. Government committee has attempted to describe how these should be handled, but the result is still vague and not widely adopted by commercial systems. But the Metadata idea is still useful and needed. Most commercial geodata based systems which provide for creating Metadata keep it in a separate file. By experience, their clients learn that it is easy to delete or mislay that file along the way. For example, a subsequent user fails to "download" the Metadata file, and then, several users later, its very existence is unknown.

MicroImages has always promoted the idea of a Project File container into which materials are assembled and managed, as contrasted to a loose assembly of possibly related files or geodata coverages. One Metadata subobject can now be created for each of the primary geodata objects in a Project File (raster, vector, CAD, and TIN). Each of these geodata objects has always had a short name and a long descriptive title. Now each can have an embedded metadata text description containing anything desired. TNT Metadata for any geodata object can be created and altered by a new Metadata editor. This editor can be accessed from any file/object selection dialog via the new "Metadata..." button. This button allows viewing and editing of the metadata subobject for the selected object. It uses the TNT standard multilingual text editor already familiar from its many other uses in the TNT products. Metadata can also be created by any other means (e.g. externally), transferred into a TNT text object, and then inserted into a new or existing Metadata subobject using the Metadata editor. Future versions of the TNT products may have the ability to generate a Metadata template based on the U.S. Government Metadata standard (SDST), depending upon the international acceptance and use of this kind of information and format.

Easy Unlock. Automatic locking of RVC files can be a nuisance for those with old, low performance desktop PCs. But many new, modern Windows based PCs and workstations are now commonly used to run multiple versions of TNTmips or multiple processes. These platforms are also more commonly part of a shared, network operation where multiple users make Project File locking absolutely imperative.
Those clients operating in this power-tool fashion clearly understand the need for such access locking. For further information on how this works, the V5.50 Grapevine MEMO presents a technical exchange between ARC/INFO users on the topic. In this exchange, they discuss in detail the difficulties and loss of coverages which can occur since access locking is not automatically provided in ARC/INFO. This inherent design flaw in ARC/INFO is also the basis for several of the "Thou Shalt Not ..." commandments presented in the special V5.50 MEMO entitled The Commandments of ARC/INFO Users.

Okay, so by now most clients are beginning to accept the need for such a lock/unlock control scheme in the multiprocessing environment of today. But that still doesn't make it any less of a nuisance in a single-user, single-processor approach limited by the use of old Windows 3.1, or on a single non-networked Mac or PC. There is also the situation where a lock is left behind when a process is killed by user action or interruption (e.g. by shutting off the machine, or by some other non-TNT process hanging the machine), or by the failure of a TNT operation. V5.40 and earlier required that the manual unlock be accomplished by opening another process, Project File Maintenance, and then executing several other steps. This was inconvenient, especially for a newcomer who had not yet had this unpleasant experience.

V5.50 provides a direct method to unlock any locked RVC file wherever it is selected. Now if a locked file/object is encountered, the Message window will automatically appear if the preference is set for it to do so (Support/Preferences, "Show Unlock in Locked File" button toggled on). It provides information on what process locked the file, on what machine, when it was locked, and the serial number of the TNT product which locked it. Based upon this information, a decision can be made to unlock the file using the Unlock button provided for this purpose. If the Unlock button is presented and used, this unlock operation will have to be confirmed a further time by a Yes or No choice in a Verify window which is automatically presented. Be careful using the unlock choice in the Message window, as the lock may be due to the file being in use and about to be altered by someone else or some other local process. Under these circumstances, an unlock action and the subsequent attempt at shared use can be fatal to the integrity of that Project File!

On a single-user system, the information provided in the Message window can be used to determine how and where the lock was set. The time of the lock, the process which set it, and the other information displayed provide the basis for determining whether the lock is legitimate and placed by some active process, or simply an artifact of some previous incorrectly exited activity.

This "easy out" unlock option has required that additional system administration options be provided for TNT systems with multiple clients. These systems' managers can determine whether individual clients have the privilege of unlocking any locked Project Files. If the file is read-only, the Unlock button will never appear for any client. If the file can be opened for writing, but an individual client does not have permission to write to or delete the locked file, then the Unlock button is grayed out for that specific client. This preference value is stored in tnthost.ini. See the structure of this INI file for more information.
This is one example of the use of the new INI reference file structure explained above. It allows control of individual users' rights and privileges where needed, but is transparent on a private client system.

* Tab Panels. A new kind of user interface gadget called tab panels has been introduced. Various forms of these panels are in wide use in W31, W95, and NT products. For example, tab panels are used in Microsoft Word to present many parallel optional dialogs (e.g. see Options or Macro on Word menus). Tab panels function just like tabs on files in a file cabinet: they present all the files in the drawer one at a time for independent selection of their contents. Tab panels present and manage a selection of independent, parallel dialog and other user interface components. Their use reduces the clutter and delays of opening and moving around a number of dialog boxes to get to those providing the inputs for a particular subsection of a process. Tab panels are being introduced in TNTmips in the Polygon Fitting, Color Binarization, Raster Spatial Filtering, and the new Surface Modeling processes. They are also used in the TNT Object Editor process in the new subsection dealing with Region of Interest generation. In these places, they appear as a rectangular panel in a dialog box that has an upper layer of one to five tabs. Each tab is connected with a different dialog panel. Only the one panel with the active tab is visible. The left mouse button can be used to select another tab and bring its contents and options forward into the active panel.

Miscellaneous. Anywhere you can specify the format for the database field containing latitude or longitude, you can now use the format DDDMMSSss. This format represents integer degrees appended with integer minutes appended with integer seconds carried out to two decimal places. For example, if the field contains 359595999, this will be translated as 359 degrees, 59 minutes, and 59.99 seconds. The DDDdddddd format has also been added; it represents degrees carried out to six decimal places. For example, if the field contains 359999999, it will be translated to 359.999999 degrees. This DDDdddddd format is how the latitude and longitude fields occur in every record on the EASI Database CD (see the enclosed advertisement for its availability). This new format will allow direct attachment, pin mapping, and other TNT uses of the extensive U.S. demographic data on this CD (see the parsing sketch below). The Line/Polygon graphic tool now defaults to the last drawing mode used (draw or stretch mode).
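As a concrete illustration of the two coordinate field formats described above, here is a minimal Python sketch of the decoding arithmetic. It implements only what this MEMO states (packed degrees/minutes/seconds with two implied decimals, and degrees with six implied decimals); the function names and the absence of sign handling are hypothetical simplifications, not MicroImages' code.

    def parse_dddmmssss(field):
        """Decode DDDMMSSss: integer degrees, minutes, and seconds,
        with two implied decimal places on the seconds."""
        value = int(field)
        hundredths = value % 10000         # SSss -> seconds * 100
        minutes = (value // 10000) % 100   # MM
        degrees = value // 1000000         # DDD
        return degrees + minutes / 60.0 + (hundredths / 100.0) / 3600.0

    def parse_dddddddd(field):
        """Decode DDDdddddd: degrees with six implied decimal places."""
        return int(field) / 1000000.0

    # The MEMO's own examples:
    assert abs(parse_dddmmssss("359595999") - (359 + 59/60 + 59.99/3600)) < 1e-9
    assert parse_dddddddd("359999999") == 359.999999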
Element Selection Dialog. The powerful Element Selection dialog is a system feature common to both the Display and Object Editor processes. The following features have been added or altered in this dialog. Additional details on the new features in this dialog can be found in the enclosed supplemental documentation section entitled The Examine Attributes Interface.

Popdown Icon Menus. More and more icons were needed in long horizontal lines in this dialog. This required that the dialog box increase in width, thus taking up more screen space and becoming too large for 640 by 480 pixel portables. As a result, another new type of TNT interface component called a "popdown icon menu" has been added. A popdown icon menu is exactly what the name implies: a menu or list which pops down from an icon when it is selected. Push the Select All/Deselect All icon in the horizontal line of icons in the detailed Element Selection dialog in the Display process and immediately see a menu appear. Select one of these menu choices with the left mouse button to perform its action.

They've Moved. The 'Deselect All' and 'Invert Selected' icons and the 'Deselect All Elements' push button for vector and TIN have been placed under the Select All/Deselect All icon. This popdown menu will appear to present these options when this new icon is selected. The 'Make Form...' operation is in the popdown menu under the 'Make Table' icon. Selection of elements via a database query was available in the Object Editor, but is now also in the Element Selection dialog. This feature, along with the Previous/Next button features described below, adds into the Display process the "Pan By Query" feature previously only available in the Object Editor. The Previous and Next icons previously in the 'Active Element Information' section have been moved to the element type row.

* Pan by Query. The Previous and Next icons operate on the set of selected elements of a specific element type and layer. In V5.40 they were in the 'Active Element Information' section; in V5.50 they have been moved to each element type row. Use these icons to switch which element type (line, point, polygon, ...) is active in each layer. When the end or beginning of the selected set is reached, a dialog box will open with a notice to this effect. If the view is zoomed in, pressing a Previous or Next icon will pan the display, at that same zoom, to the element which is active. This, along with the "canned" query selection feature, reintroduces into display the very powerful "Pan By Query" capability previously available only in DOS MIPS.

Database Forms. This feature was introduced in TNTmips 5.2. It is being reintroduced in V5.50 as part of the table functions available in the Element Selection dialog. In this new manifestation via the icon in the dialog, a database form can be viewed, created, renamed, or deleted. The rename and delete functions are accessed by pressing the left mouse button with the cursor positioned on the line that the form is on. The button to make a form has been combined into the 'Make Table' popdown icon menu (see the section above on popdown icon menus).

Region-of-Interest (ROI). Selecting elements via a Region-of-Interest is now available for vector, CAD, and TIN geodata objects. Please see the section below for detailed information on Regions-of-Interest.

Inside or Outside? The area selection tools now provide the ability to test whether an element is inside or outside the area, and whether the element is completely or partially inside or outside. This test applies to areas defined by Regions-of-Interest as well.

The Display process has been renamed "Display/Spatial Data..." to reflect the incorporation of the new 3-D perspective and other planned features. The old 3-D process available in V5.40 and earlier has been replaced by a new stand-alone version of the 3-D perspective procedure incorporated into the Spatial Data Display process.

> Making it Simpler. It has always been MicroImages' goal to continually simplify how the TNT products and processes are first used and to make this start-up as intuitive as possible. Carefully crafted interfaces can be simple to start to use while allowing progressively more complex applications to be undertaken when and if desired. Beginning clients have occasionally had a problem understanding where to start in the Display Control window. It is not always intuitive to make the first choice (layers) from the middle of a window.
However, since considerable money has been invested, professional clients quickly figure it out on their own, check the manual, or occasionally contact MicroImages. Now feedback from some of the more than 5000 parties who have obtained TNTlite has resulted in improvements for all beginners. From these potential "students of geospatial analysis" we have learned that they are not as tolerant as professionals. Their TNTlite was free or nominally free. As a result, some gave up easily if they could not quickly get an "image" of some kind up on the screen. They had nothing invested which had to be justified by "working at it" for more than a few minutes. Quickly creating their first view is a benchmark which helps keep them interested. Powerful and similar display procedures are provided in all TNT products and most processes. Improving them by making them simpler to use benefits everyone. Rearranging the Layer Control window and providing for automatic layer display have been added as part of the continuing efforts toward this objective. A color plate entitled A New Look for Spatial Data Display is attached to illustrate this change. The changes are also described in more detail below and in the enclosed new manualette entitled Getting Started in Displaying Geospatial Data.

It is also becoming clear from the wide public acceptance of Windows 95 and MacOS that the DOS and other similarly cryptic drive hierarchy structures and navigation methods may no longer be widely understood. A newcomer to the TNT products no longer learns this DOS kind of hierarchical approach from the use of other commercial windows products. These older file access concepts are still used in the TNT products in the selection of directories, Project Files, objects, and subobjects. Thus, another future challenge for MicroImages is an improved interface to allow easier, graphical location and selection of Project Files and objects.

Simplified Display Control Window. The Display Control window has been redesigned to provide a simpler and more intuitive appearance and operation. Since the "Group" and "View" panels are used less, the "Layer" panel has been moved to the top. The default order from top to bottom for these panels is now layer-group-view. This places the Quick-Add layer icon in the upper left corner of this window, where it is most easily located. Also, the default exposure of this window shows only the layer icon bar and panel. The group and view panels are now closed but can be toggled open using buttons at the bottom of the layer panel. The group and view panels will also open automatically if a group or view is created by a subsequent activity. These alterations present a simpler and smaller Display Control window for initial and simple construction of a view. The above is the new default condition for this window and can be changed using the new View Options dialog box accessed by the "Options" item on the View menu in the Display Control window. The panels will be open or closed as left from the last use. The order of the panels can be changed to view-group-layer as in V5.40. The scrolled lists which appear in each panel will now automatically increase in size as more layers, groups, and views are added to them. This makes working with complicated layouts involving many groups and layers easier, while keeping the window small until additional entries are required. The maximum number of visible lines of information in these panels is also settable.
Use this option so that the total height of the Layer Control window can be forced to stay within the vertical reach of a 640 by 480 pixel portable's screen, or expanded for a larger display device.

Instant Layers. The current view will be automatically updated when layers or groups are added, removed, or changed. Locating and selecting the Redraw button is no longer required in the new default mode. This change is in response to numerous questions from TNTlite students about how to get objects to draw after they are selected. Navigating to and selecting a geodata object is complicated enough for beginners to grasp and will need to be the focus of some future simplification. But at least now, if an object is selected, it will automatically be displayed, providing nearly instant gratification. However, this is an example of one of the many areas in which professional, experienced clients will not find this change efficient. Why? Because when ten layers are selected to set up a group and a view, the first one selected will be updated; selecting the second will require that the first and then the second be updated, and so on. The previous situation, which required that all layers be selected before any redraw, is obviously more efficient as experience is gained. In order to please everyone, automatic or manual redraw can be selected. Automatic redraw is the default. To change to manual redraw via the Redraw icon, change the mode via View/Options, which will expose the new view control Options dialog box. There is also an Auto-Redraw toggle button in the Group Controls window. With this button in the default position, the view will be automatically redrawn whenever a change is made in the Group Controls window or with the Placement tool. The automatic drawing of "hidden" layers when "unhidden" may now also be turned off in this dialog. These and several other "expert versus beginning user" options can be toggled on/off or set in the new View Options dialog box.

New View Options dialog box. The View/Options... dialog box has been added to allow reconfiguration of the View default options from beginner to expert. It can be exposed for use by selecting the "Options" item on the View menu. The default settings are those judged to be the most suitable for the first-time or casual user of the TNT products. However, experienced users may find their tasks are more efficiently accomplished by using other modes of display operations.

* Improved Locator Window. All layer types may now be displayed in the locator window (i.e. vector, CAD, and TIN in addition to rasters). For layer types other than rasters, this snapshot view must be activated for each layer via the Layer Control window menu at Layer/Show In Locator. This requirement is imposed because it might take too long to display very large objects of these types in this small locator window, and their postage stamp view would be unrecognizable anyway. In cases where "select-by-scale" is used, the map scale of the locator window will automatically be used.

Vector Objects. It is now possible to edit the styles for labels styled "by-element". This allows the design of objects using a small number of different label styles which can easily be changed, instead of having a different style for every label. When setting the "all-same" label style, the sizes may now be set relative to the current or layout map scale. This allows exact control of label heights for viewing and/or printing.
Theme Mapping. Many clients requested a modification to this process so that a subset could be interactively selected from the total range of the variable being theme mapped. Take as an example that five classes are to be distributed over an interior range whose lower bound is greater than the minimum and whose upper bound is lower than the maximum of the variable. This optional capability has been enabled by allowing the first or lowest class (class 1, ranging up from the minimum value) and the last or highest class (class 7, ranging down from the maximum value) to be excluded from the automatic distribution displayed in the theme map. By using this option, the interior range over which the remaining five classes will be distributed can be controlled. When starting, be sure to select seven classes if the option is being used and only the five internal classes are required. Just as in V5.40, the minimum value of the first or lowest class (1) cannot be edited, but its larger value can be. Similarly, the maximum value of the last or highest class (7) cannot be edited, but its lower limit can be changed. If the new interior range option is selected, the editable upper limit of the first class and the lower limit of the last class can be interactively moved in the statistics histogram panel. Moving them to new values will provide instant feedback as the five classes sought are redistributed in this histogram, the dialog box, and in the theme map when redrawn. (A small numeric sketch of this class arrangement follows.)
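To fix the arithmetic of the interior-range option, here is a minimal Python sketch. The equal-interval distribution of the five interior classes is an assumption for illustration (the process supports interactive adjustment of the limits); the function name is hypothetical.

    def interior_range_classes(vmin, vmax, low, high, n_interior=5):
        """Distribute n_interior equal-interval classes over [low, high],
        with class 1 covering the excluded lower tail [vmin, low] and the
        last class covering the excluded upper tail [high, vmax], as in
        the V5.50 interior-range option (seven classes total when
        n_interior is 5). Equal intervals are an assumption here."""
        step = (high - low) / n_interior
        breaks = ([vmin, low]
                  + [low + i * step for i in range(1, n_interior)]
                  + [high, vmax])
        return breaks   # class k spans [breaks[k-1], breaks[k]]

    # e.g. data from 0 to 100, excluding the tails below 10 and above 90:
    print(interior_range_classes(0, 100, 10, 90))
    # [0, 10, 26.0, 42.0, 58.0, 74.0, 90, 100]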
Sketch Tool. The last "sketch" element can now be deleted if desired. If a polygon is "closed", it will be saved and drawn as a polygon element instead of a line element.

> Fast Snapshots. The TNT professional products' printer setup and layout options were confusing to beginning users of TNTlite and TNTatlas. They will be even more intimidating as V5.50 makes the Map and Poster Layout process and its color printing free up to 8.5 by 11 inches (i.e. the P3 and P5 options are free) in all TNT products. Please note that printing controlled by levels P8, P10, P15, and P20 remains an optional add-on. To offer a simpler means of getting started, the Print icon button will now print a quick, first-time snapshot in the Display process without additional special input selections. The new 24-bit color support of Windows printer drivers, discussed later, also increases the quality of the snapshot print. In the Map and Poster Layout procedure, a dialog box is presented to determine if the current layout should be saved before a view is closed or the process is exited. In Display Layout mode, this prompt to save changes to the layout will be provided only if a layout was previously "opened". These prompt modes may be changed via the new Options dialog box.

* Display--3-D Perspective and Stereo (prototype process)

General Improvements. The 3-D display process in V5.40 was of the same basic design and coding as was available in DOS MIPS (a design of five to six years ago). This old process has now been removed from the menu and replaced with a totally new design which has been in gradual development for about one year. Initially, this new V5.50 process may appear to have the same general features and performance as the old 3-D process which it replaces, but with an improved interface. However, it is in the details where it greatly differs, even in this initial release. A color plate entitled 3-D Perspective Visualization is attached to illustrate the results of the process. Supplemental documentation entitled 3-D Perspective is also enclosed.

To illustrate the advances in this new process, it should suffice to note that it now has most of the advanced features already familiar from the current, state-of-the-art 2-D Display process. For example, the flexible layer management and visualization tools are directly integrated, including such things as:
Internally, the new 3-D process has been completely redesigned to support the creation of many new powerful visualization features which will continue to appear in V5.60 and later. Planned features include fly-by, fly-thru, drive-thru, and related moving visualizations. Easier stereo viewing is needed to create stereo from a DEM and ortho-image, or perhaps any georeferenced image, using TNT's layer restitution functions. Volumetric measurement tools which operate in this perspective or stereo view are needed. Some experimental tools of this type, using the elevation raster directly, are available for the first time in V5.50 as an isolated prototype process (see below).

IMPORTANT: This powerful new visualization process is now available as a standard feature without cost in TNTmips, TNTview, and TNTlite. ArcView3 and MapInfo 4.1 users may never see this kind of feature released as part of their standard product! A color plate entitled Hawaii is attached to verify the above statement and demonstrate the capabilities of this new process. This plate was created entirely in TNTmips. It shows some of the interesting and attractive output views which can be created exclusively from a DEM raster object of the island of Hawaii.

Full Stereo Viewing. Support for stereoscopic viewing of 3-D perspective views is available in both wireframe and solid rendering modes. Both stereo views are automatically generated for any complex 3-D surface which can be built in this process. This new feature simply generates two appropriate views to provide for viewing the surface in stereo. The stereo separation of the two views can be controlled by a slider to adjust the apparent depth. When fly-bys and drive-thrus are added, they will operate in stereo as well, although the computing power needed will increase accordingly. With today's desktop computers, a stereo fly-by will probably be computed at low resolution or off-line (batch mode) for later real-time use. Any raster object supported by TNTmips can be rendered in 3-D perspective, with or without overlays, and thus in stereo. Thus many new kinds of data visualizations can be undertaken in perspective or stereo. More and more DEM and other geosurface elevation data is rapidly becoming available from many sources (government assets, airborne and space interferometer RADARs, ortho/DEM processes, LIDAR, surface modeling, and so on). But it is important to realize that this new perspective and stereo visualization process provides powerful visualization of other kinds of surfaces. Some examples of stereo views which do not use elevation as a surface might be:
The new sample TNTlite data sets include a stereo pair of a subsection of two NHAP CIR (color-infrared) images. These images are of Lake Chabot, just up the foothills east of Oakland in the San Francisco area (also covered by the new San Francisco TNTatlas). Please note that this stereo set is even small enough to be viewed in TNTlite. Use the cardboard anaglyph "glasses" provided with V5.50 to view this stereo image, and use these simple stereo glasses to inspect other monochrome stereo images. Please keep the anaglyph viewer handy to try with the new stereo features to be released in V5.60 and later. The electronic color stereo viewing devices supported by all the TNT products are still the same as those listed in the V5.40 MEMO of this series. Please consult this list before buying any electronic stereo viewing device. Any client who has knowledge of new, low cost stereo viewing devices should provide information about them to MicroImages for possible support by the TNT products.

Additional New Features. The short list below introduces other features that have been added to the new 3-D process which have no equivalent in the 2-D process.

3-D Groups. It is now possible to create 3-D groups and place them in a layout for both display and printing. Each 3-D group can have multiple surface and drape layers and its own viewpoint. The 3-D groups have all the same features and controls available as in the separate Display/3-D Perspective process (described above). When setting the viewpoint for the group, the portion of the perspective view to actually place in the layout may be selected via an elastic box. This is useful, for example, in creating panoramic views and other special effects.

Side-View Stacks. Multiple surface layers can be separated vertically in a stacked "side-view" form. This concept has no equivalent in 2-D viewing. Using the "offset" field in the dialog box for each surface layer will set the spacing above the previously selected surface layer and create the "stacked" view. Each overlay layer added to a specific surface will be rendered at that vertical offset position and, as expected, be scaled to its surface layer. Stacking several such complex views offset vertically might use base surface layers which are not at the same scale. Use the "scale" field in the dialog box to specify a common (or different) scale for a composite view of the stack. A common map projection is used for the stack and will be that of the first layer selected for the stack. With the "offset" field selected in the dialog box, it is even possible to view "flat" layers in a vertically stacked perspective view.

Interactive Viewpoints. The wireframe rendering for the surface layer is fast enough that it is now dynamically updated when the viewer position is interactively adjusted, and not simply when requested. This allows rapid selection of an appropriate viewpoint. The wireframe density can also be changed for a more realistic view of the surface. After the mouse is released from moving the wireframe, a toggle option is available to automatically begin rendering the surface. These optional wireframe density settings and automatic solid surface rendering controls provide for control of how this process functions optimally on slower or faster computer platforms.

Miscellaneous. There is an option to clip, and not render, the overlay data which falls off the edge of the surface layer.
Outlapping of overlay data is quite common when the map projections and extents of the overlay layers vary and do not match the surface layer. Null cells in a raster surface layer are automatically considered to be "off the edge". The numeric values for the viewer position and view direction are displayed and can be edited. This allows precise specification of the viewing parameters.

Future Plans. At the top of the list is a new 3-D rendering engine to speed up the solid surface display significantly. It will be worked on immediately in the next quarter. Foreground smoothing was present in the previous 3-D process and needs to be added. Direct use of CAD, vector (e.g. contours), TIN, and database objects will be added, both as surface drapes and also in place of elevation rasters.

* Color Pattern Mapping. (prototype process) Prepare/Raster/Color Binarization...

Background. A Color Binarization process was introduced in V5.40 to assist in the automated tracing of solid color lines on scanned maps. It can still be accessed when the Method menu in the Color Description dialog box is set to Color Occurrence (the default). Please refer to the additional details about it in the MicroImages MEMO on the release of the V5.40 TNT products dated 21 July 1996, or in the on-line documentation.

Printed maps also contain color polygon areas which must be captured as polygon elements. The simplest examples of these would be the color screen printed areas on a topographic map which represent green = forested areas, blue = water bodies, violet = new urbanization, and so on. These areas may be bounded by a line of a distinct color (on a geologic map), a dashed line, or nothing at all (on a topographic map). They also contain many different color linear features and islands. On a complicated geologic map, color polygons are used everywhere to identify geologic materials or their properties. The varying colors which fill map polygon areas and make up the lines are often created by screened CMYK ink passes, and not by a solid ink color. If such a color map is scanned at a low resolution, its lines are not resolved for tracing. But when the scan resolution is increased, the screened colors in the color lines and polygons begin to separate into different color patterns. A new procedure has been added to the Color Binarization process which exploits these color patterns to fill in polygons or trace lines.

Introduction. The new TNTmips color pattern mapping procedure is called "Pattern Occurrence". The mapping of printed color polygons from topographic maps was the design problem for this additional procedure. But while it was designed for color map conversion, it also has immediate application to classifying multispectral imagery. Although it will work in some cases, it has not been designed for detecting and mapping the clearly visible color stripes and other patterns deliberately used in geologic maps and for some specialized features in topographic maps (e.g. orchards, vineyards, ...). Mapping these kinds of visible and large color patterns may be addressed in future development efforts.

The Pattern Occurrence procedure defines a color pattern by the color of each cell paired with the color of each of its eight adjacent neighbors. But imagine how many such color patterns (a cell plus its eight neighbors) could exist in a color map defined by red, green, and blue layers of 256 values each. This is almost 17 million possible colors for a given cell, and a vastly larger number of possible color pairs with eight neighbors.
The first concern must therefore be the reduction of the number of colors, and therefore color patterns, to a computationally manageable number. Just as in the solid color line following method, Color Binarization uses the same unsupervised neural network color classification process to reduce the color occurrence in the RGB image from 17 million colors to a number from two to 256. The neural network procedure in this Color Binarization process is essentially unchanged from V5.40. It uses the same algorithm whether the Color Occurrence (solid line) method or the Pattern Occurrence (color pattern) method is selected. However, it is important to note that two identical successive applications of the neural network process will produce two different but closely similar color classifications. This is a fundamental characteristic of using neural network analysis in various disciplines. In the TNT products, the neural network procedures involve random sampling of the raster layers to train the network, as well as random number generation as part of the analysis. This neural network process is based on the pioneering work of a Finnish mathematician (Teuvo Kohonen's self-organizing map) and has no theoretical explanation. That is to say, "it works because it works".
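For readers curious about the flavor of this color reduction step, here is a minimal Python sketch of a self-organizing-map color quantizer of the general kind described above. It is an illustration of the technique only, with hypothetical parameters, and not MicroImages' actual algorithm; note how its random sampling means two runs give similar but not identical palettes, just as the MEMO describes.

    import numpy as np

    def som_reduce_colors(pixels, n_colors=64, iters=20000, seed=None):
        """Reduce (N, 3) RGB cells to an n_colors palette with a 1-D
        self-organizing map (Kohonen network)."""
        rng = np.random.default_rng(seed)
        # initialize the network's nodes from randomly sampled cells
        nodes = pixels[rng.integers(0, len(pixels), n_colors)].astype(float)
        for t in range(iters):
            x = pixels[rng.integers(0, len(pixels))].astype(float)
            w = int(np.argmin(np.linalg.norm(nodes - x, axis=1)))  # winner
            frac = 1.0 - t / iters
            lr = 0.5 * frac                              # decaying learning rate
            radius = max(1.0, (n_colors / 8.0) * frac)   # shrinking neighborhood
            lo, hi = max(0, int(w - radius)), min(n_colors - 1, int(w + radius))
            for j in range(lo, hi + 1):                  # pull neighbors toward x
                pull = np.exp(-((j - w) ** 2) / (2.0 * radius ** 2))
                nodes[j] += lr * pull * (x - nodes[j])
        return np.clip(np.rint(nodes), 0, 255).astype(np.uint8)

    def classify_colors(pixels, palette):
        """Assign every cell the index of its nearest palette color,
        producing the two-to-256-class color classification raster."""
        d = np.linalg.norm(pixels[:, None, :].astype(float)
                           - palette[None, :, :].astype(float), axis=2)
        return np.argmin(d, axis=1).astype(np.uint8)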
Modeling Concept Used. What Pattern? Zoom into a color scan (at 300 dpi) of a green vegetation area on a topographic map. It consists of a pattern of shades of green cells bordering each other, together with some white or other light cells. It may also contain some red road cells and certainly some tan contour cells. The color pairs making up this area could be selected to define its color pattern, including its anomalously colored cells. These color pairs define a color pattern which exists in the green vegetation polygons but not in others. After the color pair pattern model is built, all the other cells in the map could be checked to see if they match all, or only a portion, of the color pairs. This would yield a binary map of those cells which match this color pattern and make up the green polygons.

Defining a Color Pattern. The procedure uses a supervised classification approach to map area features. It is trained with a sample area which is selected to represent a particular mixed color pattern which exists in the color classification raster of two to 256 colors. This training set is used to construct a co-occurrence matrix (the classification model) of color pair frequencies. The pairing of a color cell with its eight neighbor cells (eight color pairs) defines a color test pattern. A neighboring cell is any of the eight surrounding cells which border that cell. The aggregate of all these groups of eight color pairs for all the cells inside the training set defines that color pattern model.

The co-occurrence matrix constructed from the training set has all the possible cell colors (two to 256) in the input color classification raster on each axis. The matrix cells contain the count of the number of times each pair of colors occurs adjacent in the training area. Every cell in the training area can be paired with eight neighbor cells and thus contributes eight counts to this co-occurrence matrix. When the neighbor cell is taken as the center cell, it will also contribute an identical color pair count to the matrix. Excluding boundary cells, this tabulation would produce a symmetric co-occurrence matrix. That is to say, the count for color A paired with B would have an identical value across the diagonal with color B paired with A. Next note that boundary cells in the training area have neighbors outside the area. When the boundary cells' color pairs are tabulated into the co-occurrence matrix, it becomes asymmetric. Cell A at, but inside, the boundary gets paired with B outside and is counted in the matrix, but cell B just outside never gets paired or counted with A. The amount of asymmetry in this co-occurrence matrix can be just a little; this results from a large, circular, solid training area, which has the least boundary for its area and a lot of interior cells. On the other hand, a small and linear training set can produce significant asymmetry, as it has far more border cells relative to interior cells.

Adding Uniqueness--the Pair-Count Test. At this point, every cell in the training area will be identified as part of the color pattern by this model. Every unknown cell outside this training area in the color classification raster can be easily and quickly tested to determine if it is part of this modeled color pattern. This test can require that the unknown cell and one of its neighbor cells match a color pair in the co-occurrence matrix. If the result is yes, then that cell can be coded as part of a polygon containing that color pattern.
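The tabulation and the basic pair test just described are easy to state in code. Here is a minimal Python sketch, under the assumptions that the color classification raster is a small 2-D array of class numbers and that the training area is given as a boolean mask; the matching_pairs and min_frequency parameters mirror the Matching Pairs and Min. Frequency controls described in this section and the next, but the implementation itself is a hypothetical illustration, not the TNTmips code.

    import numpy as np

    # the eight neighbor offsets used to form a cell's color pairs
    OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]

    def build_cooccurrence(classes, train_mask, n_classes):
        """Tabulate the training-set color pairs. Each training cell is
        paired with each of its eight neighbors, including neighbors just
        outside the training area, which is what makes the matrix slightly
        asymmetric, as described above."""
        m = np.zeros((n_classes, n_classes), dtype=np.int64)
        rows, cols = classes.shape
        for r, c in zip(*np.nonzero(train_mask)):
            for dr, dc in OFFSETS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    m[classes[r, c], classes[rr, cc]] += 1
        return m

    def is_pattern_cell(classes, r, c, m, matching_pairs=4, min_frequency=1):
        """Code the cell 1 (pattern) if at least matching_pairs of its
        eight color pairs occur at least min_frequency times in the
        co-occurrence matrix; otherwise 0."""
        rows, cols = classes.shape
        hits = 0
        for dr, dc in OFFSETS:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                if m[classes[r, c], classes[rr, cc]] >= min_frequency:
                    hits += 1
        return int(hits >= matching_pairs)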
But each cell has eight color neighbors defining its color pattern, so the test for a match of any single color pair of the unknown cell can be made more rigorous. The color pair match count required for the unknown cell being tested can be set between one and eight. If set to eight, the unknown cell has to have a match in the co-occurrence matrix for all eight of its color pairs. If this pair-count test is true, then the unknown cell can be coded as part of a polygon containing the color pattern.

Eliminating Randomness--the Frequency-of-Occurrence Test. Color patterns resulting from printing screens have some color randomness within them due to many factors. The pattern match model can be further refined to handle these conditions by using the frequency with which color pairs occur in the co-occurrence matrix. A color pair that occurs only once in the training area is not likely to be typical of the color pattern. Suppose that one cell of a given color of pink occurs once in the training area, that none of its eight neighbor cells match each other, and that cells of this exact pink color are common elsewhere, outside the training set, in other color patterns. This pink cell will show up as a single count of one in 16 different locations in the co-occurrence matrix. All other positions in the row and column which contain it will be zero, as it never pairs with those colors. A frequency test can be applied when the color pairs for each unknown cell are tested. This test can require that each color pair of the unknown cell finds at least two matches in the co-occurrence matrix. Every unknown cell of the special pink color will fail this test. A pink cell outside the training set can never encounter a count of two or greater in the co-occurrence matrix for any of its possible 2² to 256² color pairings, as its row and column are filled with zeros and ones. Increasing the level of this frequency-of-occurrence test to three requires that two cells of this color occur in the training set, and that they occur in such positions as to have an identical second color as a neighbor cell. As this frequency value is increased, it begins to eliminate isolated, unique cells from the model.

Types of Errors. Since this whole approach is just another kind of image classification and labeling, its results can be examined in terms of commission and omission errors. Commission errors are where a cell outside the desired color pattern is incorrectly identified as being a pattern cell (i.e. the cell is committed to the wrong class and gets a one assigned instead of a zero in the binary raster). Commission puts a "pepper" pattern in areas outside the actual color polygons. Omission errors are where cells actually in the color pattern are not identified as part of the pattern (i.e. the cell is omitted from its proper class and gets a zero when it should get a one). Omission puts "salt" holes in the actual color polygons.

Getting Started. Input the Proper Scans! (this section is reproduced from the V5.40 MEMO) V5.30 introduced a new color interactive line following process into the Object Editor to be used for semi-automatic map digitization. An integral part of this process is an automatic neural network color compression step. This step automatically reduces any 16-bit or 24-bit RGB color composite raster object selected for tracing to the selected number of important colors. It learns and presents the distinct colors making up the map.
It weights the selected number of colors toward their separation in a color sense or model (i.e. color distinctness), but not necessarily toward the map area that the color or very similar colors occupy. It tries to find and group each distinct color together. For example, when used for reducing a 24-bit color raster for presentation on an 8-bit display board, it would preserve a sharper transition between the red cells representing a road and their white background. The standard 8-bit color compression method used everywhere else in TNTmips (in the scanning program, RGB to 8-bit color compression, RGB display on 8-bit color display boards, etc.) optimizes image appearance (map or photo) based upon the cell area that the colors occupy. It is designed specifically to reduce the color quantization or color banding in a continuous tone image viewed on a 256 color display board. Used on a map with distinct colors, it will tend to present a good looking display of the map for similar reasons. These 8-bit color compressed raster objects can be selected for use in the color classification process and the color interactive line following process. However, if the process is started by selecting an existing 8-bit color composite, the power of the "front-end" neural network analysis is bypassed, and all you are doing is selecting colors from the existing color palette. For best results, do not do this. Use these two processes on 16-bit and 24-bit color raster objects so that the automatic neural network compression method designed for them is applied.

The Color Binarization process can also be applied to an 8-bit grayscale scan of images and used to threshold out a particular gray range(s) and convert it/them to a binary raster object. Since the input is an 8-bit raster object, the neural network color compression step is appropriately skipped. MicroImages has found that this is a particularly effective way to extract "drawn" black boundary lines from scans of grayscale images. An example of this application is to extract the black soil polygon boundaries from a grayscale scan of the printed county soil map sheets available for all the United States, which are printed and distributed as grayscale airphotos with black soil boundaries. Often a grayscale scan of a color map or an annotated image will preserve black lines as the blackest values for a similar threshold extraction.

Neural Network Color Classification. To use this new procedure, start the Color Binarization process at Prepare/Raster/Color Binarization. Select the Method button to switch from Color Occurrence to Pattern Occurrence. Next select an 8-, 16-, or 24-bit color composite raster object to be processed. If you select an 8-bit raster, the process immediately continues on to display it in the view window. If you choose a 16- or 24-bit color composite raster, the neural network procedure provides an auxiliary Classification window where you can select the number of color classes to be produced (two to 256). Also set the sampling rate for the training of the neural network color classifier. Try using 64 colors for the first experiment. The default for this window samples every 8th pixel in each direction to train this color classification process and is usually adequate unless the input objects are small rasters or represent unusual source materials. The higher the sample rate selected and the greater the number of colors, the longer it will take to prepare this color classification raster.
Select the Apply button to continue, and specify the destination of the raster object to be created to contain the color separated result. When the new 8-bit color object has been created, the procedure returns to the Color Binarization control window, which will use this raster object and display it in the view window.

General Operation. The Color Binarization dialog box contains three of the new tab panels. When in the Pattern Occurrence mode, select the middle Pattern Controls panel. Leave all other settings at their defaults, and push the Polygon icon in the view window. Use the polygon tool to outline a training set in a color pattern of interest, then apply it with the left mouse button or the Apply button in the Line/Polygon Edit dialog box which has been exposed. This will compute the color model (co-occurrence matrix) for this training set. All color pairs making up the current model appear in the Co-occurrence Pairs window in the Pattern Occurrence panel. Do not try to choose a clean color area of the map polygon if it is crossed by roads, contours, lettering, and features of quite different colors; this would defeat the purpose of this powerful process. Including these features of different colors is important to filling in the desired color map polygons. Including these diverse colors does not necessarily mean that these kinds and colors of features will be identified (committed) outside the desired color map polygons. Remember again that it is the color pairs that count. A red road on a yellow airport background creates a subset of color pairs in the training set to map these cells into the polygon. Outside the polygons, the same kinds of red roads are printed on a white background and create a different collection of color pairs (red with white). It will take an hour or two of experimentation to get an idea of how this works.

A color pattern mapping model is now available for testing and refinement. A small, zoomed-in color test section of the color classification raster appears in the Color Binarization dialog box. Push the Show As Binary button immediately above it. The binary classification raster for this same image area immediately appears. Black cells represent unknown and training set cells identified as part of the current color pattern. White cells represent unknown cells which remain unknown. Evaluate how well the current model performs by moving the Show As Binary window around over the larger map area. To do this, use the arrow icons, or select the blue Crosshairs icon in the view window. This will expose the long crosshairs, which can be used to select any location in the view window for an instantaneous evaluation of the color pattern model. As the binary model window is moved over areas of the color map polygon and other areas, examine the white and black (salt and pepper) errors of omission and commission which remain. At any time an additional polygon can be drawn to add area to the current training set. At any time the large crosshairs can be positioned on any single cell in the view window, and the right mouse button will add that specific cell to the training set. A Reset button is available to clear the current training set and co-occurrence matrix.

Tuning the Color Model. Three interactive procedures are provided to refine a color pattern model:
The binary raster window provides immediate feedback for any area of the map as you select and adjust these controls.

Pair-Count Test. The neighbor or pair count defining the color pattern can be set from one to eight, since each unknown cell to be tested has eight neighbors. Selecting one requires that only one of the unknown cell's color pairs match a color pair in the training set. Cells with a single color pair identical to those in the sample area can exist elsewhere in the raster, outside the desired color printed area, for many reasons, and these cells will show as errors of commission in the binary inspection window. These errors of commission are drastically reduced by requiring that the unknown cell have more than one of its eight neighbor pairs matching the color pairs in the training set. Specifying that all eight color pairs must match creates a very "tight" color pattern model with a low probability that all eight color pairs of an unknown cell will match pairs in the training set. The color pattern is then more rigorously defined, and errors of commission are very low. But some cells within the desired color pattern now also have a reduced probability of matching, and errors of omission may increase beyond acceptable levels. The level of this color pair match requirement is set by selecting a value of one to eight for the Matching Pairs number in the Pattern Controls panel. Interactively evaluate the effect of changing this model parameter by testing its impact in the binary raster window.

Edge Effects. Edge erosion of the desired color pattern area can occur if a match of five or more color pairs is required, unless the edge of the color pattern area is a straight line through three cells. An irregular boundary of cells just inside the map polygon will be eroded (omitted) a little by a setting of five, a setting of six will increase this erosion, and so on. A setting of eight will peel away almost all of the single row of edge cells inside the map polygons. Increasing the pair count value also increases errors of omission in the map polygons sought, as it becomes more difficult for the cells within the desired map polygons to meet this criterion. At the default match criterion of four, edge erosion is negligible. Reducing the Matching Pairs value below four will begin to dilate (expand) the edge of the pattern slightly. Reducing the value also helps fill in almost all the map polygons, but it weakens the model, and errors of commission increase for areas outside the desired color map polygons.

Frequency of Occurrence Test. Cells with unusual colors can occur throughout all color map polygons like noise. These contribute many relatively unique color pairs to the color pattern model, which weakens the model and increases the probability of errors of commission. Such unusual color pairs can be excluded on the basis of their low frequency of occurrence. This criterion is set with the Min. Frequency value in the Pattern Controls panel. It can be raised from its default of one to threshold out low frequency color pair occurrences inside the training area. When set at 10, color pairs with a total count of nine or less in the current model are ignored. In other words, counts of nine or less for a color pair are treated the same as a color pair which is totally absent from the training area.
Increasing this threshold value just above one can drastically reduce errors of commission across all the cells tested.

Run. Throughout the training or tuning process, the binary raster test window provides immediate feedback on the results of the current color pattern model. When the results in the binary window are satisfactory, use the Run button to compute and save the binary raster object of the classified color pattern area. When the first binary raster is finished, continue on to define and map additional color patterns.

Since the color pattern mapping is not perfect, the result may next need to be cleaned and edited in some other TNTmips process, such as the morphological and filtering functions or the raster editing procedures in the Object Edit process. The binary raster objects can be converted to vector objects by using the autobounds process, and remaining problems can be addressed with the vector editing procedures in the Object Edit process. Also try the new Auto-Trace method being released in V5.50 directly on these binary rasters. It will immediately convert the binary raster to vector polygons. Its built-in removal of spurs cleans up the polygon edges, since no lines should be present in this result, and its built-in polygon area filter is very useful for eliminating the small noise polygons caused by the remaining errors of omission and commission.

Special Experimental Output. There are two new experimental output options on the Color Binarization process file menu: "Create Co-occurrence Raster" and "Create Probability Raster". The production of these two optional rasters is experimental and might be useful for some scientific applications.

Create Co-occurrence Raster. This procedure creates a co-occurrence raster object for the entire 8-bit raster object selected for use in this process. It is not the transient co-occurrence matrix which is created from the training set in the Pattern Occurrence classification model. Since it is a raster object, it can be selected and viewed immediately in this same process or any other view window. This raster is a probability matrix giving the probability of co-occurrence of color "i" with color "j". Each matrix position starts as a count of the number of "ij" pairs occurring in the input raster and is converted to a probability by dividing it by the total co-occurrences of color "i" with all colors: P(i,j) = C(i,j) / sum over k of C(i,k). Thus the probabilities of co-occurrence for color "i" with all colors sum to 1.0. The tabulated pair counts form a symmetric matrix; the cells in the outer band around the edge of the raster object are not included in the tabulation, as they do not have eight neighbors. This matrix is well known as the Markov probability matrix and defines a simple Markov (chain) process. Think of it as a model of the probability that a test cell of color "i" will occur next to a cell of color "j" (testing all eight neighbor cells). In many other modeling processes, this Markov probability matrix would give the probability of something at state "i" converting to state "j"; for example, it could model the probability that a cell of agricultural land adjacent to a subdivision cell will be converted into a subdivision.
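For those who want the computation spelled out, a hedged numpy sketch of this tabulation follows. The function name is hypothetical, and the real process works on TNT raster objects rather than arrays.

    # Count "ij" neighbor pairs over the whole 8-bit raster (edge band
    # excluded), then divide each row by its total so row i sums to 1.0.
    import numpy as np

    def markov_matrix(classes):
        counts = np.zeros((256, 256), dtype=np.int64)
        h, w = classes.shape
        for dr, dc in [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]:
            a = classes[1:h-1, 1:w-1]                 # interior cells only
            b = classes[1+dr:h-1+dr, 1+dc:w-1+dc]     # their shifted neighbors
            np.add.at(counts, (a.ravel(), b.ravel()), 1)
        row = counts.sum(axis=1, keepdims=True)
        return np.where(row > 0, counts / np.maximum(row, 1), 0.0)

    # The companion probability raster described next sums, for each interior
    # cell, the eight P values of its neighbor pairs (a value from 0.0 to 8.0)
    # and rescales that sum linearly into the 0 to 255 range.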
Create Probability Raster. This procedure creates a color model probability raster object for the entire 8-bit raster object selected for use in this process. Each cell in this probability raster matches a cell in the input raster, and since it is a raster object it can be selected, viewed, and studied immediately in this same process or any other view window. This new raster contains, for each cell in the input color raster, the sum of the eight co-occurrence probabilities for the current training set or color pattern model. Each cell in the input forms eight color pairs with its neighbor cells, and each pair has its own probability in the co-occurrence matrix for the current training set. The sum of these probabilities will be zero for input cells of a color not represented in the current training set, because with any neighbor color such a cell has a zero in the co-occurrence table for the training set. The sum can also be greater than 1.0 for a cell whose eight color pairs are each encountered frequently in the training set, because each of its eight probabilities will then be high. A solid, single color raster would produce a value of 8.0 for every cell. These sums are rescaled linearly into the 8-bit raster so that a value of 0.0 becomes 0 and a value of 8.0 becomes 255 (i.e. cell value = 255 * sum / 8). However, many of the cells in this new raster will have low values, as the co-occurrence matrix for a training set has many zero positions: only a few of the total number of colors actually occur in the training set. Viewing this raster in grayscale provides an indication of how the binary process is going to work for that color model. Bright cells are likely to be selected by the model, and the many dark cells are not.

Image Processing Applications. Concept. Applying this procedure to multispectral imagery will often require the analysis of more than three radiance bands. To do this, first apply an unsupervised classifier via the Classification process at Interpret/Raster/Classify/Automatic. This unsupervised image classification step reduces a very large number of radiance level combinations for each cell into the selected 2 to 256 class values. Select the new 8-bit classified raster object in the Color Binarization process. In this case, the colors assigned to this classified image by the process are somewhat arbitrary and merely help to visually distinguish the classes; colors which appear similar can represent entirely different classes. But the application of the Color Pattern mapping procedure provides a powerful tool for labeling these classified cells by how they occur together in areas as parts of color patterns. Think of it as a sort of color (actually radiance) texture analysis and simple spatial classification.

3-band Application. Use the LANDSAT TM sample data for the Crow Butte map quadrangle to try the application of the Color Pattern labeling process to image analysis. Since these CB_TM.RVC rasters are less than 512 by 512 cells, this exercise can also be done by students to introduce them to advanced image analysis logic in an interactive procedure. First prepare a 24-bit color composite raster from the three TM spectral bands which would normally be used in a color infrared view (Red, Green, and Photo-IR); use Prepare/Raster/Convert Color to do this. This color composite raster can then be selected in the Color Binarization process for neural network analysis to yield a 256 class raster.
Since only three spectral bands with a familiar color interpretation are used, the view window will contain the familiar color infrared image map of Crow Butte. Use the pattern training set methods described above to map the agricultural fields in this image.

N-band Application. Using the same LANDSAT TM sample data, experiment with using more than three bands. Try any unsupervised classification procedure and produce 256 classes or fewer. Select the resulting 8-bit classification raster in the Color Binarization process and proceed with the pattern labeling.

Modifications Since V5.50 CDs. The neural network unsupervised classification procedure used in this process is now almost five times faster.

Future Plans. When a human is set to work to interpret an airphoto, they might be asked to map out different but still green forest types, or to find pink cars against the green forested background. These different starting instructions would cause any of our personal neural networks to operate quite differently. In a future release, the TNT products will provide varying startup or initial conditions for the neural network procedure. These conditions can be thought of as very simple user supervised training of this unsupervised process. For example, if it is specified that the rasters represent a natural color airphoto, then the final color classification will be controlled by this a priori knowledge of the goal (i.e. perhaps weighted to result in many green hues for trees but fewer other colors). In this Color Binarization map application, the a priori knowledge would be that the rasters are RGB scans of a printed color map where a few pure colors and their variants are needed.

The Color Binarization process could also be integrated into the new Auto-Trace process. This would provide immediate feedback on how well the current color features could be converted into vector elements. An expanded Auto-Trace process could handle the conversion of multiple features at one time. For example, a color pattern model could be developed for the green vegetation map areas and a polygon name associated with it, a second model could be immediately designed for blue water map areas and named, and so on. Auto-Trace could subsequently complete the iterative analysis of the entire color raster with each model to produce all the identified polygon types in a single vector object or in separate vector objects.

* Bilinear and cubic-convolution resampling are now available for 16- and 24-bit color-composite rasters.

A full-fledged database object manager is not available as planned for V5.50. This is more complicated than originally envisioned when it was requested by several clients, and it will take some serious planning and implementation, probably as an evolutionary approach. When assignments for V5.50 were made, this new process was to provide internal, simple editing of tables such as is already provided in all external database products. But in TNTmips, relational database tables can serve two major roles: as attribute tables attached to the elements of raster, vector, CAD, or TIN objects, and as primary (geospatial) database objects.
As a refresher, a geospatial database object (alias geodatabase object or primary database object) is a table or relational string of tables which is not attached to another primary object (raster, vector, CAD, or TIN). This geodatabase object qualifies as a primary object because it contains fields, such as lat/lon, which can be used to directly map its other fields onto the earth or another 2-D/3-D coordinate reference system (e.g. as in Pin Mapping). A geodatabase object available for use in the TNT products may reside in the TNT product, Access, Oracle, ... As it is used in the TNT product, it will very likely acquire and carry along georeference, map projection, or other TNT subobjects as appropriate, even though the table stays in its original warehouse.

Providing editing and other database maintenance capabilities within TNTmips to handle tables as primary database objects was initially proposed for V5.50. However, this is the purpose and direct function of database software such as dBASE, Foxbase, Access, Oracle, ... where more and more database objects will now reside. The ODBC capabilities being added provide support for database tables which will remain in, and be maintained by, their own external database management systems or warehouses. Granted, it would be useful not to have to learn these other software packages to edit external tables. Granted also, more tools are needed to manage the internal TNT geodatabase objects.

The real database editing challenge is now more clearly defined as the evolution and structuring of the TNT tools needed to edit the string of related tables which contain attributes. The V5.50 Grapevine MEMO contains a number of comments from ARC/INFO users strongly criticizing the INFO tools available to them for such purposes. The real challenge to all serious GIS software is to provide understandable, rational editing tools which can be used to edit relational tables containing attributes! This must be done in such a way as to best preserve, manage, create, ... the linkages to the graphical elements. It is not simply a matter of supplying a set of simple tools to add, join, delete, compress, and perform other such operations on the related tables without regard to the impact of these changes on maintaining or creating correct and sensible attachments to their raster, vector, CAD, or TIN objects. It also follows that tools designed for editing attribute tables with this bigger goal in mind will be equally applicable to editing geodatabase objects.

Three new table editing tools have been added which do not alter the relationships between attribute tables and graphical elements. They also take into account the possibility that the tables being edited may actually be warehoused elsewhere in some other complex database management system. For example, a computed field in a TNTmips table can use fields from tables in Access. This is flexible: whenever Access changes those fields, the computed field in the TNTmips table changes. However, at any time this computed field can be converted into a real field and the link between TNTmips and Access severed.

Computed Fields. V5.40 introduced the concept of creating a temporary computed field in any related table. Computed fields have the advantage that they are automatically redefined at any time by changes to any of the other fields they evaluate or to the combining expression used; a sketch of the idea follows.
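The toy Python sketch below (not MicroImages' code; all names hypothetical) may make the distinction between computed and real fields concrete. It anticipates the county-code example discussed next.

    # A computed field is an expression re-evaluated on demand; "make
    # permanent" freezes the current values into an ordinary stored field.
    class Table:
        def __init__(self, records):
            self.records = records          # list of dicts, one per record
            self.computed = {}              # field name -> expression function

        def add_computed(self, name, expr):
            self.computed[name] = expr

        def value(self, rec, name):
            if name in self.computed:       # computed: evaluated fresh each time
                return self.computed[name](rec)
            return rec[name]                # real: stored with the record

        def make_permanent(self, name):
            expr = self.computed.pop(name)  # sever the expression ...
            for rec in self.records:        # ... and store today's values
                rec[name] = expr(rec)

    t = Table([{"SS": 31, "CCC": 109}])     # hypothetical state/county codes
    t.add_computed("SSCCC", lambda r: r["SS"] * 1000 + r["CCC"])
    print(t.value(t.records[0], "SSCCC"))   # 31109
    t.make_permanent("SSCCC")               # now safe to use as a key field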
However, because they may change, computed fields have the disadvantage that they must be used carefully as a primary field in the direct attachment of attribute tables to graphical elements (cells, polygons, etc.). For example, if the expression is tuned or changed, which is a good reason for having computed rather than real fields, then the attachments can be lost or garbled. V5.50 therefore allows computed fields to be made real. To do this, bring up the table in tabular view mode, then select the field by clicking on its heading. A computed field always has a blue heading or field name, whereas that of a real field is black. After selecting the blue field heading, select "make permanent...".

A simple example will illustrate the use of this table editing option. The polygon outlines of all the 3000+ counties in the United States have been encoded with a single field combining state and county for all counties together (e.g. SSCCC). A statistical table of county information is available to attach to this, but it contains one field coding states (e.g. SS) and a separate field coding the counties within each state (e.g. CCC); in this table, county codes are repeated from state to state. To overcome this limitation, a computed field can be defined as SS * 1000 + CCC (so that, for instance, state 31 and county 109 yield 31109). Since it will be used as a primary field to attach to the county outlines, it can now be made permanent for this purpose.

* Add Fields. A new field can now be inserted after any existing field in a table. Previously each new field was automatically added to the end of the table after the last existing field. A new field can be added anywhere the TNT database editor is available, via its Edit Definition dialog box. Adding a field does not change the relational structure, and thus has no impact on the attachments to graphical elements. The actual values inserted into the new field can be edited manually or created in some other subsequent process.

* Delete Fields. Any field in any table can now be deleted from any place in TNTmips where the database editor is available. The delete button was present in the Edit Definition dialog box in previous versions of TNTmips, but it was only active immediately after a field had been added and before the "OK" button was selected, causing the actual alteration of the table. Now this button is active wherever a client has write privileges for the table(s). Deleting a field in a specific table will not add, delete, or rearrange records in that table, and the table from which the field is deleted will be compressed when "OK" is selected. However, deleting a field directly linked to graphical elements or to records in some other related tables will break the relational attribute structure. Deleting only independent fields is recommended, and considerable caution should be exercised in deleting "shared" fields. Deleting a shared field can alter the relational structure in the tables or the values in other fields. For example, if a "key field" is deleted, the connection to the related tables is lost; an attempt to delete a "key field" will therefore result in a warning and a request to confirm. Deleting a field which is shared because it is used as a variable in a computed field will change the value of the computed field to unknown. At present, since this kind of field does not "know it is shared" in the computation, no special warning is given if it is deleted!

* Enterprise Databases. (prototype process) Support for Open DataBase Connectivity (ODBC) has been integrated into V5.50 of the TNT products.
Expansion of the TNT products' ODBC support to include databases on the Mac and UNIX platforms will occur in V5.60. However, ODBC and its enterprise oriented database interface is now available for use in V5.50 of the TNT products under W95 and NT. This interface was tested with Microsoft Access and with dBASE, and it is about to be tested with NT Oracle.

What is ODBC? [Quoting from Using ODBC 2, 1995, Robert Gryphon, QUE Publication, ISBN 0-7897-0015-8, pages 11 and 12.]

"Open DataBase Connectivity (ODBC) is a database access library. It does just about the same thing as Sequiter's CodeBase, the Borland Paradox Engine, Informix, ESQL, or dozens of other libraries. It allows your applications [e.g. the TNT products] to manipulate data in a database."

"What, then, is so different about ODBC? You probably have invested years of your life learning other database libraries. There is probably little appeal in learning another."

"ODBC does have one major distinction. It can manipulate almost any database. It can access DB2 on an AS/400 [DB2 is IBM's mainframe database product]. It can manipulate Btrieve files on a laptop. It can access files that you might not even consider to be databases, like Excel spreadsheets or ASCII data."

"Surely there must be some shortcomings? The biggest limitation, at the time of this writing, is where ODBC operates. This book assumes that you're using one of Microsoft's Windows ODBC implementations, either the 16-bit Windows or the 32-bit Windows NT [or W95]."

"Applications can use the ODBC Application Programming Interface (API) on either platform with little or no change in the source code. A UNIX implementation is now available from Visigenic (see following Note), and while this book does not specifically address any idiosyncrasies it may have, 90% of the material covered should apply equally to that platform."

"Note: Microsoft has promised ODBC software [in 1995] for the Apple Macintosh [now available] and computers running under UNIX. There are also plans to make ODBC available for extended MS-DOS applications. In October 1994, Microsoft licensed Visigenic Software Inc. to provide ODBC APIs and drivers on non-Windows platforms. At this writing [in 1995], Visigenic has thus far provided only a UNIX implementation."

[MicroImages has just checked directly with Visigenic, and they have ODBC support available now for the PMac. However, ODBC support has also just been provided by Apple to developers such as MicroImages and will subsequently be distributed without cost with the TNT products.]

"You might be curious as to how ODBC can manipulate such diverse databases. That's the easy part. The vast majority of databases conform, in whole or in part, to the relational database concepts of E.F. Codd. Even products based on file management systems predating the relational model map well to it. ODBC looks at databases for what they have in common, not how they differ."

The ODBC drivers for Windows and Macintosh database products are available as part of the purchase of those products (either free or as an add-on option). In some cases, such as Microsoft Access, NT Oracle, Foxbase, and most other PC database products, ODBC support is free with the database. For UNIX workstations, Visigenic's ODBC support is available from the vendor/manufacturer of the database, but at an extra cost. A more detailed discussion of the availability of ODBC for UNIX database products appears in the ESRI exchange reproduced a few paragraphs below.
MicroImages has just checked directly with Visigenic and learned that the necessary ODBC developer libraries are available for purchase for the Sun (Solaris 1.x and 2.x), HP (HP-UX), IBM (AIX), and SGI (IRIX) platforms. Support for the 64-bit DEC UNIX is being developed. Drivers for each specific enterprise database on each platform are available for Informix, Oracle, Sybase, and others.

The steps necessary to configure the ODBC driver to be addressed by a TNT product for access to an external Data Source will depend on the database package involved. The best place to look for help is the manual that came with the database software and its ODBC product. Once the ODBC driver is properly installed, the system has an ODBC Data Source available to other software.

Setup of ODBC. Configuring and using an ODBC Data Source is an integral feature of the W95 and NT operating systems; after all, Microsoft invented it. ODBC support is free with Microsoft Access when purchased as a separate product or as part of the Office Pro 95 collection, so Access will be used here as the example. Integrated ODBC access has also just been released by Apple as an add-on extension to the current MacOS. An Access based Data Source under W95 and NT is outlined here and in the attached black and white plate entitled Using ODBC Data Sources.

Before using a Windows ODBC Data Source, it must first be configured using the ODBC Control Panel. This is a Windows Control Panel, not part of the TNT products. If an ODBC Control Panel icon is not available within Windows, then the ODBC driver has not been installed; see the manuals which came with Access to configure it as an ODBC Data Source. Once an ODBC Data Source such as Access has been configured, the TNT products can establish and maintain a link to it to directly use its data, or can simply import tables from it. To do this, in TNTmips and TNTview go to Prepare/Import/Export and pick ODBC from the list of database types to import from. As previously, TNTatlas has no import and can use these external Data Sources only for HyperIndex stacks prepared in TNTmips and TNTview. There can be more than one ODBC Data Source available to TNTmips or TNTview, either locally or via a network. Clicking on the "Source..." button will bring up a list of the ODBC supported Data Sources that have been configured and are available; select one. From this point on, it is the same as linking to or importing from any other database already supported by the TNT products.

MicroImages would like some input from clients who use, or plan to use, this ODBC Data Source support. Try importing or directly using database tables in TNTmips or TNTview from a Data Source through ODBC. If it does not work, let MicroImages know what database format and what ODBC driver were used. MicroImages currently has only a beginning collection of ODBC drivers and sample data sets, and those databases which clients are testing and providing feedback on will get first attention this quarter as this TNT to ODBC interface is advanced. If an error message appears stating there are no Data Sources or tables, then your Data Source is not configured correctly. If you get any other error message from Windows, please report it and include the following information:

a. The exact text of the error message, including module names and line numbers.

b. The ODBC driver being used (e.g. Access, dBASE, ...), including the version number if readily available.

c. Where the driver was obtained (e.g. came with Access, downloaded from the net, ...).
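For orientation only, the sketch below shows what any ODBC client, the TNT import process included, does with a configured Data Source: open it by name, list its tables, and pull records across. It uses pyodbc, a present-day Python package that is not part of the TNT products; the DSN name and table name are hypothetical.

    import pyodbc

    conn = pyodbc.connect("DSN=Access95;")   # DSN must already be configured
    cur = conn.cursor()                      # in the ODBC Control Panel

    # Equivalent of the "Source..." button: see what the Data Source offers.
    for row in cur.tables(tableType="TABLE"):
        print(row.table_name)

    # Pull one table's records across, much as the TNT link/import step does.
    cur.execute("SELECT * FROM Soils")       # "Soils" is a hypothetical table
    for record in cur.fetchall():
        print(record)
    conn.close()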
Use with Excel? Users of TNTmips routinely move database tables to and from their Excel spreadsheets for modeling. Excel can quickly and easily read or write a spreadsheet to and from a database table. The spreadsheet is then used for creating complex models for data coming from, and going to, geospatial analysis. The Windows 95 and NT systems both show an ODBC driver for Excel; it can be seen in the center window labeled Data Sources in the attached black and white plate entitled Using ODBC Data Sources. The implication is that TNTmips can now exchange data directly with Excel. This recent discovery is being investigated and will be reported upon more thoroughly.

Restrictions.

1. No W3.1 Support. W3.1 is a 16-bit operating system under which the TNT products function as 32-bit applications. MicroImages does not plan to make the backward investment in time needed to add support to the TNT products to communicate with the 16-bit version of ODBC for this gradually fading operating system. The few clients still using W3.1 platforms are using earlier database products such as dBASE III or IV, Foxbase, ... TNTmips and TNTview already have good direct support to link to, and use or import tables from, these simpler database products. In fact, the Microsoft ODBC support for dBASE III and IV is limited, as noted in 2 below.

2. Don't Use with dBASE. An ODBC driver is supplied by Microsoft with W95 and NT for the dBASE III and IV products. This specific driver is poorly designed and will support only limited queries: it confines a query to a single table and to the evaluation of a single field! However, this restriction poses no problem with the TNT products. Simply use the previously available, built-in capability (not ODBC) to link to these dBASE tables and continue to query them just as in V5.40 and earlier. A better ODBC driver may be available directly from Borland.

3. Only Relational Links to Data Sources. The TNT products cannot rely on the records in an ODBC table being in any consistent order. Thus, records in an ODBC linked Data Source's table cannot be directly attached to TNT graphical elements. However, they can be related through a key field in an internal TNT attribute table. Labeling graphical elements in TNT objects always creates at least one field in an attached table (e.g. a soil identification code for soil polygons or for land parcels). Since this table is internal to the TNT product, it is always available to supply a key field relating to the records in the tables in an ODBC linked Data Source.

4. No Write Support. TNTmips 5.5 cannot write to ODBC tables. This will be added later, probably in V5.60. Remember that these external enterprise level Data Sources (or even local, personal Access tables) are created and controlled by the external database software and its owner. Protection of large or valuable relational databases means that they are often locked by the database software (e.g. Access), and only the database manager has write privileges. Clients who want to create new data or modify existing data directly in the ODBC linked Data Source will need to wait for the implementation of this feature. In the meantime, continue as at present to create internal table(s) and export them via dBASE format to Access or some other database product.

A Little Wine from the Grapevine. An Internet exchange via the ESRI list server was posted on 22 October 1996 by an ESRI customer after the V5.50 Grapevine MEMO was printed.
It is very pertinent to reproduce this material here in this New Features section, as it helps to further clarify MicroImages' decision to release in V5.50 an ODBC interface to external enterprise databases and database warehouses. As usual, items in [ ] are added for clarification.

[Statement] "I received an informative response from [a name at ESRI]. My co-worker responded with more questions for better clarification, and [the name's] complete reply follows."

[Question 1] "Dear ARCers, A couple of weeks ago the question was posed: 'Does ESRI support Microsoft SQL Server?' The answer was 'ONLY for Windows NT, not UNIX'. I was wondering if ESRI is planning/intending on talking to SQL Server (or another ODBC PC database) in the future so the UNIX and NT versions [of ARC/INFO] will stay on the same level of abilities. I'm asking this for my PC-loving co-worker who wants me to access a database of shapefiles on a PC that he is creating. Any thoughts, discussion, knowledge, will be appreciated."

[Response from the ESRI DBI (DataBase Interface) programmer] "The answer to your question is that the plans and intentions of ESRI don't enter into the discussion. The fact of the matter is that Microsoft does not provide either a UNIX version of SQLServer or the necessary UNIX client software to make a connection from UNIX to a PC server. Until they or some third party do so, there is not much ESRI can do about it. I should add that if UNIX to PC connectivity software for SQLServer became available, it would most likely be something you would have to pay extra for."

[Question 2] "Hello, I am the PC-loving co-worker. The issue is this: Does ESRI intend to work with, or support, an enterprise system of sharing data that is based on any industry wide standards? Are there any standards in the UNIX world equivalent to ODBC?"

"It is clear that ESRI supports this standard in the Microsoft operating systems (NT and Win95, with shapefile ODBC drivers). You mention support for Microsoft Access v7; will this be through ODBC, and is the ARC/INFO Microsoft SQL support through ODBC?"

"The question is not will ARC/INFO support Microsoft SQL Server v6.5, but will ARC/INFO support an Open DataBase Connectivity standard of any kind that is platform independent."

"A quote from Inside ODBC, by Kyle Geiger, Microsoft Press, page 6: 'ODBC has had a long and favorable relationship with several standards organizations. The X/Open SQL Access Group, ANSI, and ISO are all working on advancing the core elements of the ODBC API through their respective organizations.' ..."

"'ODBC is a cross-platform solution. From its conception ODBC was designed to work on various operating systems (and with various programming languages). Portability is a primary concern of the standard bodies ... Microsoft has licensed the source code to ODBC to third parties who have provided or will soon provide ODBC on the Macintosh, a variety of UNIX platforms, OS/2, and other operating systems.'"

"Is this information wrong? I would think that ESRI would have some insight as to ODBC or some equivalent standard being brought to UNIX, since that is the OS that ESRI has a lot of experience with, as well as ESRI's experience with ODBC. My intention is to build a data system that is ODBC compatible and platform independent. There is no problem in that, because all players support ODBC, except UNIX ARC/INFO. I understand that it is not ESRI's responsibility to create a standard for UNIX.
Does ESRI (or you) have any insight into the inconsistency between the types of data sources that the PC version and the UNIX version have access to? Or do I tell my UNIX-loving co-worker that she should move to NT if she is to participate in the enterprise? (Not likely to happen, and I shouldn't have to.)"

[Response from the ESRI DBI programmer] [The programmer obviously would have had to check and formulate the following response via management policy.] "Sorry to take a while getting back to you. You ask:"

"The question is not will ARC/INFO support Microsoft SQL Server v6.5, but will ARC/INFO support an Open Database Connectivity [ODBC] standard of any kind that is platform independent."

"The answer is that we would like to, if one was available. The fact is that ODBC has not yet caught on in the UNIX environment as strongly as it has in Windows. To the best of my knowledge, there is no other standard out there which is comparable - on any platform."

"ODBC is essentially free and widely available on Windows. You do have to purchase a driver in some cases, but ODBC itself is free. Many of the DBMS [DataBase Management System] vendors supply a free ODBC driver; for example, Microsoft provides a free driver for SQLServer, and they also supply the Desktop Driver Pack (which supports Access, dBASE, Excel, etc.) for free. I am fairly certain that Informix provides a driver for free too. There are third party vendors such as Intersolv and Visigenic which supply a large variety of drivers for different DBMS."

"The situation is quite different on UNIX. ODBC is not free. The sole source is Visigenic. You have to buy both ODBC and the driver. I was informed by a co-worker that Visigenic charges $150 per seat for ODBC on UNIX. There are some UNIX platforms which are not supported at all. What should ESRI do? Should we pass on the $150 per seat fee to each UNIX ODBC user? Should we tell users on an unsupported platform that they are out of luck?"

"You asked if the DBI [DataBase Interface] support of Microsoft Access [by ARC/INFO] was through ODBC. The answer is yes. This is also true of the Microsoft SQLServer support; in fact, there is only one interface being used for both. So far, everything looks very good for the SQLServer. Unfortunately, we are having some big problems with the ODBC Access driver - the driver appears to be substandard in my opinion. This has been a big disappointment - Access works great - it is the driver that has a lot of limitations."

[It can be assumed that the above comments primarily reference the use of ARC/INFO on an NT based platform or network also using Access. MapInfo V4.0 supported Microsoft Access via ODBC. MapInfo V4.1 has also just added a new interface to Access via a direct Microsoft interface called DAO (Data Access Objects). As Microsoft developers, MicroImages has the DAO information available and is reviewing it now for possible use.]

"We are looking into supporting more ODBC drivers using our interface. It depends on user demand to some extent. Also, as we get more experience with different drivers, we will make the interface more generic and robust."

"You stated that 'all players support ODBC except UNIX ARC/INFO'. I am afraid that I must disagree, and in the strongest possible way. If ODBC was as freely available and in widespread use on UNIX as it is on Windows, believe me, we would use it."
"The quote from [a name] is not wrong, but there is a world of difference from having a design and standard on paper and having something which is actually implemented in a useable and affordable way. All the standard bodies in the world can agree on everything (a minor miracle in itself), but it doesn't do software developers and end users any good if the vendors don't follow and implement the standards." "There is a chance that ODBC on UNIX will improve in the future. I think ODBC is a very well designed system. If I had my way, I would redesign DBI [ARC/INFOs DataBase interface] to use ODBC exclusively. ODBC is a perfect solution for a product like ARC/INFO. If we could use it, we would eliminate all the DBMS dependent code that is so expensive for us to develop and maintain. We would be able to satisfy all those users who want to connect to a DBMS other than the four we officially support on UNIX (Informix, Ingres, Oracle, Sybase). Unfortunately, ODBC is simply not there for us [for free] at the present time. Hopefully, this situation will change in the near future." "[A name], DBI Programmer, ESRI" Conclusions. So what interpretations can be placed on this exchange? 1) ESRI's DBI programmer likes the standardization and utility of ODBC. 2) ESRI is maintaining a lot of expensive, private UNIX-only code to maintain the interfaces between their internal INFO database and several primarily UNIX based database products. 3) It is not practical to ask all those university, non-profit, and philanthropic sites with site licenses of say $100 to $200 per chair, and many, many chair licenses, to cough up $150 per chair for the UNIX ODBC license fee. So they have to continue to maintain the private UNIX interfaces which they own. 4) But, many university and other lower budget sites are moving to NT from UNIX based servers. On these networks, these users will use Microsoft Access, Oracle for NT, and ... whose ODBC interfaces are free, and where TNTmips is less expensive and TNTlite is free. 5) ODBC is the standard, will prevail, will improve, and is being integrated into the TNT products. There has also been a well publicized competitive war going on between Microsoft and Oracle Inc. between Access on the PC versus Oracle on the UNIX platform. Oracle Inc. was not going to move very fast to support Microsoft's ODBC standard and their associated Access, dBASE, Foxbase database push. Providing a free UNIX ODBC driver for Oracle to make it easier for W95 and NT clients was not beneficial to Oracle Inc. Furthermore, on an expensive UNIX based enterprise system, a $150 per seat fee would not be considered a problem especially if the license was for "floating seats" (e.g. counted only at the time of actual use). However, Oracle Inc. has now released NT Oracle with the ODBC support which is mandatory to compete on NT. As a result, Oracle Inc. is under increasing pressure to supply a free ODBC interface for their UNIX product. Should they do so, then Sybase, Informix, and Ingres will have to immediately follow suit.
General. New objects created in the Object Editor are now given the default name 'Newxxxxx', where xxxxx is the object type (vector, raster, CAD, TIN). The default description is 'Created in the object editor' and is defined in messages.txt in the group [tntedit] under 'NewObjectDesc'. You can change this default naming to something else (e.g. your name and address) and to some other language by editing the messages.txt file.

When a reference layer is changed into an edit layer via Reference/Edit, the current display settings are retained (i.e. the view does not change).

The operations icons on the 'Edit Elements' dialog for both CAD and vector objects have been rearranged. The 'Delete' icon is now the standard 'X' icon and has been moved to the last position in the operations row.

The initialization for the XY digitizer is now saved in the new user preferences file (TNTProc.ini). After the first initialization, all subsequent executions of the Object Editor will use these same stored XY digitizer setup parameters until the digitizer is reinitialized for some other reason.

Selecting elements using a region-of-interest (ROI) is now available for vector, CAD, and TIN objects. Using a ROI to "cookie cut" out an area of a raster object is not yet available but is scheduled for addition. This is a new power feature which uses the polygon mask in the region-of-interest to define the complex spatial area to be edited. See the detailed section below for more information on the use of the ROI object.

* Elements by Area. A large or small group of vector, CAD, or TIN elements can be selected by area for a further operation such as deletion. Area selections were previously defined by using the circle, square, or polygon tools. A ROI can now also be used to define a complex set of areas. These areas can select elements inside or outside them, as determined by one of the new Region Test modes. These modes also control how elements which cross the area's borders are handled (see the sketch at the end of this item):

'Partially Inside' selects all elements which are totally inside the currently defined area, plus all elements which cross its boundary. Only elements which are totally outside the area are excluded.

'Completely Inside' selects only elements which are completely inside and do not cross the boundary of the currently defined area. Elements which are outside, or which cross the boundary, are excluded.

'Partially Outside' selects all elements which are totally outside the currently defined area, plus all elements which cross its boundary. Only elements which are totally inside the area are excluded.

'Completely Outside' selects only elements which are completely outside and do not cross the boundary of the currently defined area. Elements which are inside, or which cross the boundary, are excluded.

These modes provide additional controls for the area selection and related interactive GIS functions which can now be accomplished directly in the Object Editor and in the display process. In using these new area selection modes it will quickly become apparent that "Clip to Inside" and "Clip to Outside" modes are also needed. These additional modes are planned, but they are complicated by the fact that selection of partial elements requires a lot of manipulation of the attached attributes. For example, clipping (cutting) polygons at the edge of an area tool requires that an extract operation be completed and a new set of standard properties be computed (e.g. new area, ...).
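The four Region Test modes reduce to simple logic. In the Python sketch below (not MicroImages' code), each element is summarized by two hypothetical flags computed elsewhere: inside, true if any part of the element falls inside the area, and outside, true if any part falls outside; a boundary-crossing element has both set.

    def selected(mode, inside, outside):
        if mode == "Partially Inside":      # totally inside or crossing
            return inside
        if mode == "Completely Inside":     # inside and not crossing
            return inside and not outside
        if mode == "Partially Outside":     # totally outside or crossing
            return outside
        if mode == "Completely Outside":    # outside and not crossing
            return outside and not inside
        raise ValueError(mode)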
Vector Objects. The Vector Tools window provides an Add Region icon which exposes the Region Edit Controls window. Select a ROI from this window and use the Add button to insert the boundaries of this ROI into the vector layer being edited.

The snap operation in the Element Selection dialog now allows entry of the snap distance and its units. Many distance units are available for selection, including the original vector coordinate units and screen pixels.

TIN Objects. Single or multiple triangle edges, nodes, and areas can now be selected for editing operations.

Using the separate validate process available to check and correct the topology of a vector object will now also generate a new basic standard attribute table. Since this process is automatically used in several other vector processes (e.g. import, vector combinations, vector extract, merge vector objects, ...), the standard table will automatically be revised by them as well.

V5.40 introduced a variety of patterns which can be applied to a large, complex polygon to partition it into a grid cell arrangement of many small square, hexagonal, or other polygon shapes. V5.50 automatically creates a subobject attribute table for all the grid cell polygons created in the new vector object. In this table, each grid cell polygon automatically has a record with two string fields attached. These fields define the matrix position of that cell polygon in the grid, in the form a1, a2, b1, b2, ... The first field holds a letter value specifying the column position; the second holds an integer specifying the row. (A sketch of this labeling follows below.)

The two column and row coordinate fields attached in V5.50 in this fashion are of the "One Record per Element" type, but a single record might be attached to more than one grid cell polygon. This occurs when the required grid cell is split into two or more polygons by some complex edge or island effect in the original bounding polygon. The two or more individual small polygons created in this fashion inside this grid cell will all share the same grid cell column and row coordinate record.

Creation of an attribute table for the grid cell polygons has useful applications. For example, start by creating a grid cell sampling structure (e.g. quadrats) inside a number of large forest-type units (or geologic units, soil polygons, ...). Ecological, biological, geochemical, and many other observations and measurements can be collected within all or some of these quadrats on a sample basis using a GPS unit. Enter this field data into a spreadsheet for analysis and modeling, where each row of cells is headed by two cells containing the grid cell's column letter and row number (a1, a2, b1, b2, ...). After massaging and analyzing the sample field data in this spreadsheet, it can then be further subjected to geospatial analysis and visualization. Simply save the spreadsheet as a table(s) containing the column and row fields. Use the grid tools in TNTmips to "survey in" the grid cell polygons into the corresponding forest-type polygons. Use the new internal table of column and row coordinates to relate the grid cell polygons to the table created in the spreadsheet. Then proceed to further process the grid cells within TNTmips: for example, from a DEM add slope, aspect, elevation, 3-D surface area, and other physiographic data in new fields added to the records from the spreadsheet, add the soil type composition and characteristics from a polygon overlay, and then move the table back into the spreadsheet for further analysis.
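A small Python sketch of this grid labeling follows. Note that the memo does not state how columns past "z" are lettered; the spreadsheet-style aa, ab, ... continuation below is an assumption, as are all the names used.

    from itertools import count

    def column_letters():
        """a, b, ..., z, aa, ab, ... (spreadsheet-style continuation assumed)."""
        for n in count(1):
            s, k = "", n
            while k:
                k, r = divmod(k - 1, 26)
                s = chr(ord("a") + r) + s
            yield s

    def grid_records(n_cols, n_rows):
        """One record per grid cell: a letter column field, an integer row field."""
        cols = [c for c, _ in zip(column_letters(), range(n_cols))]
        return [{"COL": c, "ROW": r + 1} for c in cols for r in range(n_rows)]

    for rec in grid_records(2, 2):
        print(rec["COL"] + str(rec["ROW"]))   # a1 a2 b1 b2, matching the memo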
The grid sampling process can also be used in reverse: the grid cell polygons are generated and then combined with other spatial objects (e.g. slope and aspect maps, soil maps, surface area, and so on), and the related attribute table(s) exported to an external database table. The table will then contain the a1, a2, b1, b2, ... column and row record identification fields, the lat/long position, the area and volume of the surface, and the many other useful parameters of the grid cells created by geospatial analysis. The table can then be loaded into a spreadsheet where the records are subsequently associated with all the other manually collected field sample data (e.g. species occurrence, species count, and ...) for each grid cell polygon. The geospatially derived data and the field sample data can then be subjected to a modeling effort, and the results returned to TNTmips for viewing and further geospatial analysis.

Modifications Since V5.50 CDs. The Grid Generation process can now generate sample points within the grid cells for the center, systematically unaligned, and random strategies. A reference image or any other type of object can now be displayed in addition to the vector object whose polygons are to be gridded. All the element selection procedures are now being added for choosing the polygons to be gridded.

A selection of 16 "scalable" north arrows, ranging from simple to fancy, is now available. These new CAD objects are located in the same NORTH.RVC file as the single north arrow previously available. A color plate entitled North Arrows illustrating this new selection is attached to this MEMO.

User-defined contrast enhancement is now available in the scanning process. This feature was available before this process was rewritten for V5.30. The scanning process will also dynamically update the displayed image when the contrast parameters are changed, allowing more rapid selection of suitable contrast values.

* Automatic Map Tracing. (prototype process) This new process combines raster thresholding, raster thinning, raster line tracing, vector filtering, and vector thinning into a single, easily used, production oriented operation. It is designed to automate the extraction of map lines into polygon boundaries from scans of paper maps and airphotos. This first prototype is restricted to handling grayscale rasters; a discussion of its possible future expansion to color appears below in the section Future Plans. A supplemental documentation section entitled Auto-Trace is enclosed to describe the operation of this prototype process in more detail.

The Test Problem. The design task for this first test version was to permit the rapid conversion of the published U.S. soil maps into vector polygon form. This example application will be explained here, and in more detail in the supplemental documentation, to illustrate what this new process does. However, many other kinds of maps can be reduced to polygons by this process. It would also be immediately applicable to extracting polygons or lines from interpretations drawn in solid color or black on grayscale airphotos or full size black and white orthophotos.

Historically, soil maps of the United States have been prepared by the Natural Resources Conservation Service (NRCS, formerly SCS, the Soil Conservation Service) and are available in hardcopy format in most NRCS county or district offices. Soil polygon information is a very important variable in many new natural resource applications of GIS systems.
It has always been an important variable in agricultural management, and increasingly so with the advent of precision farming, conservative private crop insurance programs, land conservation efforts, land appraisal, tax valuation, and related, progressively more site-specific agro-management activities.

The detailed soil surveys of the United States have been prepared over the past 60 years by soil scientists at the county offices of NRCS. They are also stored and distributed from these same county offices and from state offices. It is necessary to contact the NRCS county, district, or state office to obtain a particular county soil survey within a state; as a result, this valuable national resource information is not centrally located. These detailed paper soil maps are not available in suitable digitized form for detailed geospatial application except in a small proportion of "test" counties. After 10 to 15 years of arguing about how this digital form of the soil polygons would be created and distributed, their creation is still not underway in any serious fashion, and there is little hope that it will be completed in the next 10 years. As a result, those citizens and taxpayers who wish to use this valuable resource information have had to set about privately converting it to vector form county by county.

Most counties' printed soil surveys consist of a thick bound book printed by NRCS, made up of from 50 to 200 11 by 17" soil map sheets. Another large portion of the book contains the many printed tables needed to describe the characteristics of each soil type mapped in the county. Fortunately, all these tables of soil types and their many properties are already available for all the counties in the nation in database formats from the individual state offices.

The 11 by 17" soil map sheets in this book are the individual pieces of the soil map of the county and are usually graytone reproductions of sections of old, high altitude airphotos at a scale of 1:20,000. On these airphotos the NRCS soil scientists have laboriously compiled the detailed polygon boundaries of all soils in the county, and thereby in the nation. These soil polygon boundaries were originally drawn as solid black lines on separate overlays. During the reproduction process, the graytone airphotos were screened and printed in graytones, but the soil boundary lines were overprinted in a separate pass in solid black ink.

Procedure. Introduction. It is possible to very reliably extract the black soil polygon lines from the printed graytone soil survey sheets by thresholding scans of these maps into a binary raster. However, other information was also added to the black feature overlay and overprinted onto these soil survey imagemaps along with the soil boundaries. These other solid black features include drainage, added as dashed lines; extensive hand printed black lettering showing the soil type letter codes multiple times in each soil polygon; section and other public land boundaries; and some other minor, miscellaneous features.

Batch Approach. First these soil sheets are scanned into grayscale raster objects at 300 dots per inch. Then each raster object is selected as input to this new map tracing process. One or all of the rasters (sheets) for a county can be selected, since their original printing and subsequent scanning usually produce identical grayscale characteristics.
How many are selected is determined by the drive space needed to store the source rasters and the batch time available for processing. All the soil map sheets in the county can be selected for processing at once over a weekend, or a subset selected which can be completed overnight. On a Pentium at 166 MHz using Windows 95, each 11 by 17" test sheet in our local county (Lancaster) required approximately 17 minutes. This is the time to complete all the steps outlined below, from reading the grayscale raster object from a CD through writing the vector object with correct topology to a hard drive. Using NT on a 180 MHz Pentium Pro set up exactly the same way, the conversion of each of the same series of test sheets required seven minutes.

Operation. Auto-Trace is located at Prepare/Convert/Raster to Vector/Auto-Trace. When it is started, it will bring up a new Raster Thresholding and Tracing control dialog box and a Sample Results window. (Note: for the first use of this process, the Sample Results window defaults to a position under the view window; move the view window to expose it.) Use the Raster selection button, and the familiar Object Selection window will appear. Use it to select one or more grayscale rasters to process and close it. Immediately a view window will appear displaying the most recently selected raster. Select any other input raster in the Raster Thresholding and Tracing dialog, and it will immediately appear in the view window.

The Raster Thresholding and Tracing dialog has a panel to control how the grayscale raster will be converted to a binary raster. A slider is available to set the threshold value. Toggle menus allow selection of black or white thresholding, setting of a null value, and tracing of black or white areas. The Sample Results window already shows the binary raster produced by the default settings for the small red area outlined on the input raster displayed in the view window. The binary raster in this Sample Results area is automatically updated when the threshold is changed or the red area is dragged to a new location on the input raster in the view window. This test area and its red box can be increased by enlarging the Sample Results window. Make sure the threshold and other settings produce a clean binary raster (i.e. one which does not pick up any background image values).

The same panel has a button for selecting the type of output object to be created. The first two options are a binary raster and a thinned binary raster (i.e. a raster with lines morphologically thinned to one cell in width); use these options if you wish to create these raster objects for use elsewhere. The third choice creates a vector object by tracing the outer boundaries of the binary areas to form polygons. These three choices require no further control input, and selecting the Run button will use the current general settings to process all the input raster objects into output raster or vector objects. The fourth choice, Vector Line Trace, creates a vector object with polygons from the line work via the current binary raster threshold settings. It opens a new drop-down panel which controls the vector filters to be applied: for each filter there is a button to apply or disable it, plus a slider and matching editable numeric value to set the filter's key control parameter in binary raster cells.
The object of these filters is to eliminate as many other line features and artifacts as possible from the desired polygons. Note, all these filters default to off. Use the button in front of each of them to apply it.

Close Gaps (End-to-End). Gaps in continuous lines may occur when the original draftsman did not draw the soil lines contiguous or wide enough. Gaps are sought and closed where a line continues in the general direction of the line being tested. The maximum gap value can be set numerically or by slider. It should be set small enough to avoid closing the spaces in the dashed drainage lines.

Close Gaps (End-to-Line). This filter looks for gaps which should be closed by extending the line being tested to close up to a crossing line. These kinds of gaps occur when the original draftsman did not quite close a line up to another line. This maximum gap value can also be set numerically or by slider.

Deleting Dangling Lines. The dangling line filter is applied after most of the drafting gaps are closed by the prior two filters. It was designed to be used elsewhere to remove small spurs or overshoots where the draftsman crossed the line when closing to it. It is reapplied here with a "no length" setting. It thus tests all lines and deletes every line with a floating (unattached) end. This removes most dashed drainage lines, all "open letters," and the drain lines, other lines, and most characters which touch or cross the soil polygon boundaries, since all of these have at least one floating end. The closed portions of letters such as "o", "a", "A", "p", ... are not deleted. Lines such as section lines, which have no gaps and are bounded (not floating) on both ends, are also not deleted.

Remove Small Polygons. The closed portions of letters such as "o", "a", "A", "p", ... and those open letter sections which happen to close at both ends with a soil boundary or other line are still present and represent small polygons. The minimum polygon size filter can now be used to remove these areas. A slider or numerical cell count can be used to set the upper size limit of the polygons to be deleted. Set this filter area, and it will be applied after forming the polygons.

Thin Lines Filtering. Lines traced cell-to-cell on a high resolution binary raster contain many excess vertices and corresponding very short line segments. Many of these tiny line segments represent noise, stair-stepping, or other micro artifacts in the raster line and can be deleted. The Thin Lines filter removes these meaningless segments, yielding fewer, longer line segments. It operates by substituting straight line segments for "noisy" or very slightly curved strings of vertices. This filter tests for possible substitution until the straight line tested against the string of tiny line segments reaches a maximum offset equal to the parameter set in the slider. It then replaces the string of vertices tested with the straight line. This reduces the storage required for the vector object by at least 90%. It also proportionally reduces the time it will subsequently take to render and process this vector object in all other TNT processes.
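One classic way to implement this kind of maximum-offset test is the Douglas-Peucker method; a minimal sketch follows (Python, for illustration only, since MicroImages has not stated which algorithm the Thin Lines filter actually uses):

    import math

    def perp_dist(p, a, b):
        # perpendicular distance from vertex p to the segment a-b
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        seg = math.hypot(dx, dy)
        if seg == 0.0:
            return math.hypot(px - ax, py - ay)
        return abs(dx * (ay - py) - (ax - px) * dy) / seg

    def thin_line(points, max_offset):
        # keep replacing strings of tiny segments with one straight segment
        # as long as no original vertex is offset by more than max_offset
        if len(points) < 3:
            return list(points)
        d, idx = max((perp_dist(p, points[0], points[-1]), i)
                     for i, p in enumerate(points[1:-1], 1))
        if d <= max_offset:
            return [points[0], points[-1]]
        return (thin_line(points[:idx + 1], max_offset)[:-1]
                + thin_line(points[idx:], max_offset))

With the one-cell setting used in the sample run below, such a test removes most of the stair-stepping of a 300 dpi trace without visibly moving the boundary.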
Manual Editing. After the vector filter parameters have been selected, the process can be run and will sequentially convert each input raster to a polygon vector object in the seven to 17 minutes noted. Running the process in batch mode for all 60 of the 11 by 17" soil sheets for our local county (Lancaster) requires three hours on a Pentium Pro at 180 MHz and about four times as long on a Pentium at 133 MHz. However, nothing is perfect, and Auto-Trace simply automates one step in a larger sequence of steps. The next step could be to use the vector editor to manually correct any remaining problems in these polygons while they are overlaid in color on the original graytone raster. For example, the straight section boundaries and road lines can be very quickly deleted. Simply select each straight line segment in a view using the multiple element selection feature and delete them all at once. While in the object editor, each polygon can also be labeled by sequentially selecting all soil polygons of one type. Then key in the original soil type code still visible on the reference image, or select it from a prepared table.

Geometry. Somewhere in the end-to-end process of creating a county's digital soil map from the 50 to 200 sheets, the distortion in these printed image maps has to be removed. This can be accomplished with the original scans by warping the raster objects, or later by warping the vector objects that Auto-Trace has created from them. Warping the scanned raster object requires more storage space and computation time. It has the advantage that corrected pseudo-ortho image maps are created for other uses such as mosaicking. Adding control points for this warping process (raster or vector objects) can be quite adequately accomplished if accurate section corner positions are available from some source such as a GPS survey. Significantly less accurate section corners digitized from 1:24,000 USGS topographic maps can also be used if their accuracy projected into the final soil map is adequate. Use the georeference process to find the position of each section corner on the scanned raster object and assign them their correct coordinates. Then warp these image maps (sheets) to the desired map projection. Since topographic relief in most agricultural soil map areas is not large, the resulting new raster object will not have a large error due to relief displacement.

Note that the section boundaries are not removed in the Auto-Trace sequence. Undoubtedly, some kind of straight line filter could be derived to remove such straight line segments over a specified length. But these lines and the intersections they define are available to use as control points to adequately correct the geometry of the vector object for each 11 by 17" sheet to the desired map projection using the warping process. In fact, vector points representing the measured section positions can be automatically snapped to and associated with the section line intersections in the vector object produced by the Auto-Trace process outlined above. Finally, since the vector object representing each sheet is now georeferenced, all of them can be merged to form a vector object of all of the county or of some smaller map quadrangle units.
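The warping step can be pictured as fitting a coordinate transformation to the section-corner control points and resampling through it. A minimal sketch of a least-squares affine fit in Python with numpy (an assumption for illustration; the TNTmips warping process also offers higher-order models):

    import numpy as np

    def fit_affine(src, dst):
        # src: N x 2 section-corner positions measured on the scanned sheet
        # dst: N x 2 correct map coordinates for the same corners (N >= 3)
        src = np.asarray(src, float)
        dst = np.asarray(dst, float)
        A = np.hstack([src, np.ones((len(src), 1))])   # [x y 1] design matrix
        coef, _, _, _ = np.linalg.lstsq(A, dst, rcond=None)
        return coef                                    # 3 x 2 transform

    def apply_affine(coef, pts):
        pts = np.asarray(pts, float)
        return np.hstack([pts, np.ones((len(pts), 1))]) @ coef

The residuals at control points not used in the fit give a direct check on whether GPS-surveyed or 1:24,000-digitized section corners are accurate enough for the final soil map.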
Sample Run. A raster object in the Crow Butte standard sample data set distributed with TNTmips can be used for a sample run of this process. Locate the Project File CB_11X17.RVC on the hard drive or install it from the V5.50 CD. Select the raster object THRESHOLD to be converted to a vector object. This object contains a single 11 by 17" soil sheet which has been scanned at 150 dpi (not optimal) and previously converted to a binary raster. Other than the separate conversion to binary and its lower resolution, this raster object is identical to the grayscale scan of the original U.S. soil map sheet. Note in the view window in Auto-Trace that it has many hand-printed soil type labels and other annotations, dashed drainage lines, section lines, and other artifacts, all still represented in wide raster lines.

Process this raster object through Auto-Trace using the following parameters. Set the Threshold = 0. The input raster only has values of 1 and 0 and will simply be thresholded again with this value, producing no changes. Set up to produce a Vector Line Trace; Binary: 0 (black) value Below Threshold; Output Null As: 0 (black); Trace Value: 1 (white). Now use all the vector filters with their parameters set as: Close Gap (End-to-End) - Distance = 10; Close Gap (End-to-Line) - Distance = 6; Remove Dangling Lines - yes; Remove Small Polygons - Size = 100 cells; and Thin Lines = 1.0. Running this test creates a new vector object in about five minutes on a Pentium. Inspect this new vector object in the normal display process.

Modifications Since V5.50 CDs. Auto-Trace now has a slider to control the length of the Remove Lines filter. This sets the maximum length of the spur or isolated line which will be deleted. This filter is no longer called the Dangling Lines filter, as it also removes isolated lines under this length. The process now also removes "bubble" polygons. Bubble polygons are those which are made of only two or three lines. Bubbles can occur in the binary raster in wide lines or where a line gets thick and bubbles out. Where this occurs, the thresholding step has put holes inside these wider lines which ultimately produce spurious bubble polygons. The bubble filter tests each bubble polygon against a size set by a slider. If a bubble polygon is under the specified size, it is deleted by removing the longest line making up the bubble. If a bubble is removed, the lines removed can generate new bubbles, which are in turn tested for and removed. This will then delete a "nest" of bubbles, which can occur when a complicated "blob" appears in the binary raster.

Future Plans. The grayscale Auto-Trace process will be useful to the sub-group of TNTmips clients with grayscale or binary scanned materials, especially soil maps. It has been assembled as an experiment in a new level of integration of pieces which already exist in TNTmips. If enough other clients request it, this process can be expanded to extract color lines and polygons (e.g. contours, bathymetry, roads, ...) directly into a vector object from color scanned materials. Many new features and processes are requested by new and experienced MicroImages clients. Which ones are added, and when, requires a judgment call as to the number of clients who would benefit. In V5.50, the separate neural network color binarization process can already be applied to extract all the lines of a selected color into a binary raster (e.g. contours). This binary raster can then be input to Auto-Trace for batch conversion to a vector object. This vector object can then be filtered and cleaned up in the object editor while overlaid atop the original color scan. A sequence of separate vector objects can be created in this fashion and subsequently combined into a single vector object using the vector combination processes. This takes a lot of steps, and some of them could be brought together in this new process.
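The color extraction idea can be pictured as a per-cell distance test against a target line color. A minimal sketch in Python with numpy (a simple Euclidean RGB distance is assumed here purely for illustration; the V5.50 process uses a neural network classifier):

    import numpy as np

    def color_binarize(rgb, target, tolerance):
        # rgb: H x W x 3 color scan; target: (r, g, b) of the theme to extract
        # Cells whose color lies within tolerance of the target become 1.
        diff = rgb.astype(float) - np.asarray(target, float)
        dist = np.sqrt((diff ** 2).sum(axis=2))
        return (dist <= tolerance).astype(np.uint8)

    # e.g. pull brown contour lines out of a scanned topographic map:
    # contours = color_binarize(scan, target=(140, 90, 40), tolerance=60)

The resulting binary raster is then fed to Auto-Trace exactly as a thresholded grayscale scan would be.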
Expanding the Auto-Trace process to handle color materials would require integrating into it the color binarization process (now available for both lines and polygon areas). Color binarization would be used interactively, like thresholding, to set up the process to isolate the binary raster containing the color features or map theme of interest. After each color extraction model is interactively designed for each map theme, the appropriate filter types and their control parameters could be selected and interactively tested on the binary raster which would result for that theme. Conversion to polygons, lines, or points each needs a separate and expanding collection of vector filters. For example, the final pass to extract small polygons representing a blue lake area theme would not apply the small polygon area filter and would create closed lake polygons in the vector object. But a pass to extract the red road line theme could apply a line "straightness" test and create only road lines in the composite vector object. The sequence of steps to set up the extraction of a color theme could be iterative, so that a pass could be designed for each specific color (one for brown contours, one for blue bathymetry, one for green vegetation polygons, ...). The batch conversion step would then sequentially use these steps or passes to convert the original scan into a binary raster, retain its identity (e.g. contours), and process it with the selected filters into composite or separate vector object(s).

It is common to select a large collection of elements to extract by using the "By Element" selection dialog. However, it is also often necessary to show some other object(s) to be used as references in order to identify the elements to be extracted. A Layer Control dialog box has been added which will now be exposed to allow these reference layers to be set up and viewed while selecting elements. These layers can be controlled via the 'Options' menu, which provides all of the same options that are in "Display Spatial Data" under the view window's options menu in other processes. A simple example will illustrate this kind of application. A United States county boundary vector object (no state boundaries) is displayed over a state boundary reference vector object. This allows those border counties along irregular state boundaries (e.g. Montana bordering Idaho) to be selected for extraction. It is common to display a complex vector object over a reference road map and image and use these as references to select the elements to be extracted. For the region selection dialog, a layer control dialog has been added to allow reference layers to be manipulated. The 'Options' menu on the view window now contains all of the options that are in "Display Spatial Data" under its view window options menu. A minimal standard attribute table is now automatically generated at the end of the extract process after the new vector object is validated.

Considerable additional effort has gone into improving the reliability of all these vector combination functions, especially when applied to very large vector objects of 10 to 100s of megabytes. The Logical AND (alias Intersect) and Logical Exclusive OR (alias XOR) cases have been especially improved. Vector elements can be selected for any analysis by using the "By Element" selection dialog. However, it is also often necessary to show some other object(s) to be used as references in order to identify and select the elements to be extracted.
A Layer Control dialog window has been added which will now be exposed to allow a reference layer to be set up and viewed while selecting elements. These layers can be controlled via the 'Options' menu, which provides all of the same options that are in "Display Spatial Data" under the view window's options menu in other processes. A minimal standard attribute table is now automatically generated at the end of the extract process after the new vector object is validated. As in ARC/INFO, the TNTmips vector combination and related operations are scattered around in various processes for convenience. For example, there is a separate Extract process. A printed color reference chart entitled TNTmips Vector Analysis Operations has been prepared and is enclosed to help in the location and use of these powerful vector manipulation procedures. This chart also graphically shows what happens to polygons in each analysis. For those who already have some acquaintance with ARC/INFO, the chart also contains the name of that system's equivalent computer program when one is available. The 'Vectors...' button for selecting the vector objects to merge has been replaced with 'Select...', 'Remove', and 'Remove All' buttons to allow better control of the list of vector objects to merge. A minimal standard attribute table is now automatically generated at the end of the merge process after the new vector object is validated.

* Region-of-Interest.
(prototype feature)
IMPORTANT: The use of Regions-of-Interest (ROIs) provides direct and immediate power in interactive geospatial analysis. This concept is a very significant step forward in the evolution of the TNT products. The idea is built upon many unique features built into the TNT products and finally makes them accessible in a graphical and interactive framework. Please take time to study these first descriptions of the use of ROIs on the color plates provided. The immediate applications in the area of GIS are obvious. The applications in the area of image analysis (IPS) are less obvious, but no less important. More obvious applications in image analysis will be realized as the image-oriented region definition tools are released in V5.60.

Background. Geospatial analysis must mix and use a variety of geographic data structures to accomplish advanced objectives. This mixing idea occurs throughout the TNT products, as this approach has been one of the fundamental design concepts. Examples of the use of this idea occur throughout the products.
However, all these operations use predetermined approaches for the use of these mixed data structures. Object-In/Object-Out (OI/OO) processes allow complex "batch-like" geospatial analyses of similar and mixed object types in the TNT products. Using the Vector Combinations process with query control is an example. Query control is commonly used in both simple and complex applications. But it takes time and experience to design queries. Thus, alternate interactive methods are gradually being implemented and released. Some methods are already available in the TNT products for interactively analyzing similar geospatial data structures (i.e. similar object types). The "on-the-fly" conversion of raster objects from different sources and map projections and their fusion is one example of what is already available in TNTmips. But it should not be necessary to use the OI/OO approach to prepare a new vector object from two other vector objects to find the elements they have in common in some subarea. An example would be showing the information records for the water wells in one vector object which fall inside a soil polygon selected from another vector object.

Interactive combination of geospatial data becomes more complex when it requires the use of objects of varying data structures (i.e. several mixed type TNT objects). For example, it is common to want to display the points stored in a database object (not a vector object) for a selected polygon(s) stored in a vector object. This could have the same objective as above of displaying water wells and their attributes. But the well records reside on a file server in an ODBC-linked Access database object maintained by someone else. There should be no need to import these database records into a vector object to extract and show them for the soil polygons or other areas of interest.

The Region-of-Interest (ROI) concept is one of the planned new interactive means of defining and asking inter-object questions in the TNT products. ROIs can currently be created and used in the Display process and Object Edit process. This concept provides a variety of means by which a complex area-of-interest can be quickly and interactively selected using features in one object and applied immediately to perform a function on another object. This ROI can also be saved as a new object type for use with any vector, CAD, or TIN object in a subsequent application. Applications to raster and database objects are also in preparation. Operation of the ROI features is outlined by the initial supplemental documentation enclosed entitled Using Regions. Also, a color plate entitled Creating and Using Regions for Selection is attached to illustrate the results of the application of the concept in a demographic analysis. An additional detailed natural resource sample exercise is provided below. Repeating this exercise will quickly illustrate the power of this new geospatial analysis tool provided by V5.50.

Definition of ROI. General. A Region-of-Interest (ROI) is a new, but simple, object used to define a complex geographic area-of-interest. It is a collection of the areas of one or more polygons, with or without islands. An island is a polygon of some type different from its enclosing polygon. Islands at any level of nesting are included within the area of the ROI if the criteria selecting polygons would include them if they were not an island. In other words, if an island is selected when a ROI is created, then its area is included in the ROI.
As a corollary, the first island can be excluded from a ROI if it is not selected, while the second island within it is included because it is selected. A ROI is made up of areas defined by polygons, thus it is scale independent. It thus differs significantly in this respect from a binary raster mask, which must match in scale and cell size the raster object to which it is being applied. The ROI has an implied projection, and that projection is stored with it. A ROI is considered to be georeference independent, which means that any coordinate conversion for ROIs and the objects they are used on is handled internally. There are no lines, points, polygon identifications, database tables, or other attributes stored with the ROI object.

Active ROI. A single ROI created for immediate use is called the active or selected ROI. If it is the only ROI created during the use of the process, by default it will be the ROI available and used anywhere in the process. For example, it will be automatically used anytime the ROI icon is used to select an area from any layer in the Display or Object Edit processes. The active ROI is not named or saved when the process is closed.

Temporary ROIs. Many internal temporary ROI objects can be created and used within a process. But there may be no reason to permanently save all or any of these temporary area constructs for future use after the process is closed. They are simply interactively created, temporarily stored, applied, and then forgotten. But they do stay around for possible reuse during the duration of the process unless deliberately deleted. Each temporary ROI will be named using defaults and saved as an internal object when created by the process. The names of these temporary ROI objects appear in a list in the Region Edit Controls window. The last ROI created will be the active ROI unless another is made active by selecting it in this list. All these temporary ROIs will be lost unless the Save As icon is used to save each of them one-by-one.

Saved ROIs. Unless the active ROI is named and saved as a ROI object, it will disappear when the process is closed. It can be saved by using the new Regions icon to expose the Region Edit Controls window. The Save As icon in this window will allow the active ROI object to be named, described, and saved in any Project File just like other objects. At any time a temporary ROI can be made the active ROI, to save it, by selecting it in the Region Edit Controls window. A ROI object which was previously saved to a Project File can be recalled at any time for use by the Display and Object Editor processes. Recalling one or more saved ROI objects converts them into named temporary ROI objects for the duration of the process. They then function just as any ROI which was actually created in the process. To retrieve and 'reload' a ROI object in Display, use the red arrow Select icon in the view window to expose the Element Selection dialog box. Use the large Edit Elements... icon in the Object Editor to expose its Element Selection dialog box. Then in both processes select the Region tool icon. This exposes the Region Edit Controls window and the list of any existing temporary ROIs already available (e.g. previously created or loaded). The Add icon in this window will expose the familiar File/Object Selection window. Navigate as usual to the correct Project File and a saved ROI object. Select this object as usual, and this saved ROI will be loaded for use as a temporary ROI.

Prototype ROI.
With the exception of the polygon fitting option (discussed below), in V5.50 an active ROI can only be created by selecting polygon elements from one or more layers. That is to say, all ROI boundaries will correspond to some previously prepared vector polygon boundary segments. However, it is still easy to create a ROI in the Object Editor from features in a raster layer (e.g. draw timber type polygons over an airphoto). Simply create a new vector layer over the raster, draw polygons in this layer, and then select from these to define an active ROI. This ROI can be immediately applied in the Editor to select vector elements from some other vector layer (e.g. timber type polygons selecting overlapping soil polygons). Obviously it would be easier to simply directly draw and edit a series of polygons and then decide to immediately use them as the active ROI. Another need is to edit the polygon boundaries in any existing previously created ROI. These and other interactive procedures will begin to become available in V5.60 to create ROIs by means other than the selection of existing polygons. Direct drawing, buffer zones, flood filling, and region growing are all examples of techniques which can be added to interactively create ROIs.

These planned extensions of the concept require the use of a prototype ROI, which is one that is being interactively shaped or designed in some tool, but is not yet complete and is still editable. When the areas inscribed are satisfactory, an OK button will then convert the prototype ROI to the active ROI for immediate application in selection, saving, etc. An example of the future use of the prototype ROI would be to use an interactive buffer zone function. It would draw the buffer zones as a prototype ROI. If they are satisfactory, use the OK button to make them the active ROI. If they are too wide, set a new distance and recompute, automatically erasing the existing prototype ROI. Similarly, a multiple polygon drawing tool could be used to simply draw multiple polygons around image features which the OK button will immediately convert to a ROI. As a corollary, selecting the same multiple polygon edit tool will make the active ROI into the prototype ROI, which can be edited and then converted back into the active ROI by the OK button.

Creating ROI by Selection. ROIs can be created in the Object Editor from any vector objects being edited or used as a reference layer. A ROI can be created and saved from any selected polygon elements at any time using the Element Selection dialog box. The set of polygons to define the ROI can be selected sequentially, one at a time; by the rectangle, circle, or polygon area selection tools; by a query; or even by the current active ROI. All the area selection tools now react to the new Region Test mode settings added in V5.50 to the Element Selection dialog: 'Completely Inside', 'Partially Inside', 'Completely Outside', and 'Partially Outside'. The impact of these mode settings is described in more detail above in the Object Editor section, and their application is illustrated below. These mode settings determine whether or not elements which cross the boundary of the area selection tools (including the ROI) are in or out of the current area selection. It will become immediately apparent that the additional area selection modes of "Clip to Inside" and "Clip to Outside" are also needed.
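The four Region Test modes correspond to standard spatial predicates. A minimal sketch of the logic in Python using the shapely library (an illustration only; the TNT products implement these tests internally, and the exact treatment of boundary-crossing elements is an assumption here):

    from shapely.geometry import LineString, Polygon

    def passes_region_test(element, region, mode):
        # element: geometry of the vector element being tested
        # region: the aggregate polygon area of the ROI or selection tool
        if mode == "Completely Inside":
            return element.within(region)
        if mode == "Partially Inside":
            return element.intersects(region)      # any overlap counts
        if mode == "Completely Outside":
            return element.disjoint(region)
        if mode == "Partially Outside":
            return not element.within(region)      # some part lies outside
        raise ValueError(mode)

A drainage line crossing the region boundary thus passes both "Partially" tests but neither "Completely" test.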
These additional clip modes are planned, but are complicated by the fact that selection of partial vector elements requires proper maintenance of the attached attribute tables. For example, clipping (cutting) polygons at the edge of a ROI requires that an extract operation be completed and a new set of standard properties computed (e.g. new area, ...). Similarly, when a road is trimmed by the ROI, its length and other properties change. Tools to manage these changes already exist elsewhere in the equivalent Object-In/Object-Out (OI/OO) operations in TNTmips, such as the Vector Combinations process. These will need to be integrated into the equivalent ROI applications.

Using ROIs in Display. The following example illustrates the interactive geospatial analysis now available in the Display process (thus also in TNTview) by creating and using two simple ROIs. This same exercise can be accomplished in the Object Editor, where significantly more complex applications can be completed using its additional editing capabilities.
The objective of this Display exercise is to totally interactively obtain a look at all the drainage lines which pass through any soil type polygon contained entirely within several sections of public land in the Crow Butte map quadrangle. This is a hypothetical question but suffices to illustrate the use of the ROI and the procedures. The exercise would become much more meaningful if a fourth object were available constituting the polygons for the same map quadrangle which show the areas of a chemical pollution spill. Generally the activities to be completed in these exercises will use the Crow Butte sample objects for a single 7.5 minute map quadrangle, which will all conveniently fit into the view window. However, remember this is not a WYSIWYG (What You See Is What You Get) exercise. Throughout the procedure, much larger objects could be used interactively to achieve the same results over a much larger geographic area. The final selected features, attribute tables, vector object created (in the Object Editor), and other results apply to the entire study area. The only difference in procedure might be the need for scrolling the view periodically and using a faster computer. Start by displaying the public lands, soils, and hydrology vector objects from the Crow Butte sample Project File provided with every copy of TNTmips, TNTview, or TNTlite.
Display the Objects. Display all three of these vector objects in superposition in any order. Make sure their vector elements are selected as follows using the Vector Object Display Controls dialog: for public lands show all polygons, for soils show all polygons, and for hydrology show all lines and polygons. It does not make any difference for this exercise which additional vector elements show as long as these do.

Expose Selection Dialog Box. Use the red arrow Selection icon in the view window to open the Element Selection dialog box. Use the radio selection button next to the Selection Parameters option to expose the panel containing the Group 1 icon. Use the Group 1 icon to expose the three icons for the vector layers currently displayed. Use each of these three icons to expose each detailed subpanel of selection control icon bars for each layer or object. It would be convenient at this point to enlarge the vertical height and width of this Element Selection dialog box so that it shows everything within it. Expand it over any other window except the view window, since these two are almost all that is needed.

Set Initial Conditions. The defaults of these three objects will have
turned-on or turned-off some previous selection of the types of vector elements
(lines, polygons, ...) they contain. IMPORTANT. Set the red arrow icon at the beginning of each and
every vector element type to "off"
for every icon bar for all three layers. None of the red arrow Selection icons
should be pushed. All of them should have a bright (not dimmed) background. Test
this by trying any of the interactive selection tools, as none of them should be
available for element selection.

Check the Setup. Check to see if things are set up correctly up to this point. All three layers should show in the view window. The complex looking Element Selection window should be exposed with all element type selection icon bars showing for all three layers. No elements are selected in the view window. Elements cannot be selected using the tool and mode icons. This may all seem complicated just to get to the point of starting the exercise. But flying a plane and doing geospatial analysis both require a preflight checklist. The above steps are needed to initialize things. Failing to do so will cause this ROI exercise to produce results different from those to be described. All the above is only a few mouse clicks to anyone already familiar with the use of the previous powerful selection tools in the TNT products.

Making a Public Lands ROI. The first ROI to be created, in 10 mouse clicks, is the one which defines the sections of public land of interest. 1) Press the red arrow Selection icon at the beginning of the public lands polygon tool bar. This will make these, and only these, public land (i.e. complete section) polygons selectable. If this particular red arrow Selection icon is gray, then way back at the very start, polygons were not selected for display for this layer. Note, it may still look like polygons are available if only lines were selected! 2) Push the Box icon in the area selection tools at the top of the dialog box. This will be used in the view to inscribe the sections to define the ROI. 3) Use the toggle menu to set the Region Test area to "Completely Inside". This will cause the box tool to select only sections whose polygon area is totally inside the current box. 4) But, before making any selection box, let's select two areas of disjoint public land sections to make up this ROI. Push the Select icon for the selection mode at the top of the dialog box. This will allow the box tool to be opened twice (or more) in the view window to select two sets of public land sections. 5) Pull out the box tool on the view around the first collection of the public land squares (probably green). Try four squares (2 by 2 miles) in the upper area of the view. When the box inscribes the sections, push the right button to select them. The selected public land polygons are in red. If they are not satisfactory and need to be changed, use the Select/Deselect All icon available in the public lands polygons tool bar to deselect them. When satisfied with the selected polygons in red, immediately use the box to inscribe a second set of separate sections, and push the right button to add them to the current set of selected polygons. Try a second group of four squares in the lower area of the view. At this point two sets of four squares are in red. To this point the procedures may be familiar, as only the selection tools previously available in V5.40 have been used. 6) Convert the selected polygons to an active ROI by pushing the new Create Region icon available in the public lands polygons tool bar. This will expose an Edit Object Name and Descriptions window so that this active ROI can be named and saved as a temporary object. Push the OK button to accept the defaults. A temporary ROI named PLANDS is now available. Note that the Create Region icon is only available at the end of the polygon element type tool bar for each layer. ROIs are areas, and can only be created from polygon elements and not from line, node, or other vector elements.
7) Push the Select/Deselect All icon available in the public lands polygons tool bar to deselect them. This is not mandatory but will deselect the red areas and clean up the screen.

Making a Soil ROI. The next ROI needed will contain all the area of all soil polygons which occur completely inside the public lands of interest (i.e. inside the PLANDS ROI). This will take eight mouse clicks. 8) Nothing further is to be selected from the public lands layer, so press its red arrow Selection icon for the public lands polygon tool bar to turn this layer off for selection. At this point the initial state has been restored, and nothing can be selected from the view. 9) Press the red arrow Selection icon for the soils polygon tool bar. This will make these, and only these, soils polygons selectable. If this particular red arrow Selection icon is gray, then way back at the very start, polygons were not selected for display for the soil layer. Note, it may still look like polygons are available if only lines were selected! 10) Push the new Region icon in the area selection tools at the top of the dialog box. This will expose the new Region Edit Controls window. Its scrolling list contains one temporary ROI, named previously by default as PLANDS. The last temporary ROI created or restored is automatically selected each time this window is opened. The selected ROI is also automatically loaded as the active ROI. Thus PLANDS is now the active ROI and can be used as a tool to select soil polygons. This active ROI will function like the box tool in step 2) above and is outlined in cyan. 11) The toggle menu to set the Region Test area should still read "Completely Inside". Do not change it. This will cause the active ROI tool to select only the soil polygons whose area is totally inside its area. 12) While it is not necessary, to avoid confusion, push the Exclusive icon in the selection modes at the top of the dialog box. This will avoid inadvertently applying the active ROI twice in the same place. 13) Push the Select button in the Region Edit Controls window. This applies the active ROI to the soils polygons and selects in red all of them which are totally inside this PLANDS ROI. Note that at this point the experienced operator can expose all soil attribute tables for just these soil types contained completely within the PLANDS ROI. 14) Convert the selected soil polygons to the active ROI by pushing the new Create Region icon in the soils polygons tool bar. This will expose an Edit Object Name and Descriptions window so that this active ROI can be named and saved as a temporary object. Push the OK button to accept the defaults. A temporary ROI named CBSOILS is now available. Note that the Create Region icon is only available at the end of the polygon elements tool bar for each layer. ROIs are areas, and can only be created from polygon elements and not from line, node, or other vector elements.

The Region Edit Controls window has remained exposed. The CBSOILS ROI name now appears in it, and this is now the active ROI. Remember that the area of the CBSOILS ROI is all the area of all the soil polygons inside the PLANDS ROI. The boundary of the active ROI equals only the outer boundary of all the soil polygons included completely inside the PLANDS ROI. Any boundaries held in common between soils polygons inside the PLANDS ROI have been dissolved. Thus the view now shows the active ROI as a bounding outline (as the "two's complement").
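The dissolve behavior just described, where boundaries shared by adjacent selected polygons disappear and only the outer outline survives, is what a polygon union produces. A minimal sketch with the Python shapely library (an illustration of the concept only):

    from shapely.ops import unary_union

    def region_from_polygons(selected_polygons):
        # A ROI is the aggregate area of the selected polygons; edges held
        # in common between adjacent polygons dissolve in the union, so
        # only the outer boundary (and any true holes) remains.
        return unary_union(selected_polygons)

    # cbsoils_roi = region_from_polygons(soil_polys_inside_plands)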
15) Push the Select/Deselect All icon available in the soils polygons tool bar to deselect all elements, and push the Redraw icon. This is not mandatory, but will deselect the selected red soil polygons and clean up the view.

Selecting the Drainage. This CBSOILS ROI contains all the area of all soil polygons which occur completely inside the public lands of interest (i.e. in the PLANDS ROI). This new active ROI can be used as an area tool to select all the drainage lines which pass through these soil polygons. This will take four mouse clicks. 16) Nothing further is to be selected from the soils layer, so press its red arrow Selection icon to turn this layer off for selection. At this point the initial state has again been restored, and nothing can be selected from the view. 17) Press the red arrow Selection icon for the hydrology lines tool bar. This will make these, and only these, hydrology lines selectable. If this particular red arrow Selection icon is gray, then way back at the very start, lines were not selected for display. 18) Use the toggle menu to set the Region Test area to 'Partially Inside'. This will cause the active ROI tool to select only drainage lines that pass through the soil polygons comprising the ROI. 19) Note that the temporary CBSOILS ROI is still selected in the list in the Region Edit Controls window and is thus the active ROI. Push the Select button in the Region Edit Controls window. This applies the active ROI to the hydrology lines and selects in red those contained at least in part inside the CBSOILS ROI.

This exercise shows the kind of complicated interactive geospatial interrogation which can now be accomplished with these new ROI tools. A minimum of 22 quick mouse clicks has formed and executed the query. Once understood, these 22 mouse-controlled steps, starting from the initialized display conditions in the Element Selection dialog, require one minute to complete on a Pentium at 133 MHz. But these steps have accomplished a lot of geospatial analysis, even though they were completed entirely in the Display process.

Selecting the Ponds. Now try a variant of the above exercise to display only hydrology polygons within the CBSOILS ROI. This can be interpreted as showing all the farm ponds, impoundments, wetlands, or other hydrology polygons in the selected soils and public lands. Note that there are not many hydrology polygons in this vector layer. Thus it is possible, depending upon the particular area of the CBSOILS ROI, that this ROI may not find any polygons to select. 20) Push the Select/Deselect All icon available in the hydrology lines tool bar to deselect them, and push the Redraw icon. This will deselect all the previously selected red drainage lines and clean up the view. 21) Nothing further is to be selected from the hydrology lines in the hydrology layer, so press its red arrow Selection icon to turn this element type off for selection. At this point the initial state has again been restored, and nothing can be selected from the view. 22) Press the red arrow Selection icon at the beginning of the hydrology polygons tool bar. This will make these, and only these, hydrology polygons selectable. If this particular red arrow Selection icon is gray, then way back at the very start, hydrology polygons were not selected for display. 23) Use the toggle menu to set the Region Test area to "Completely Inside". This will cause the active ROI tool to select only the hydrology polygons whose area is totally inside the CBSOILS ROI.
24) Push the Select button in the Region Edit Controls window. This applies the active ROI to the hydrology polygons and selects in red all of them which are totally inside this CBSOILS ROI.

Design Another to Show Attributes. A common ecological/environmental question of the past few years has been to identify all the soil types in a legally defined area, such as a county, which contain wetlands. The use of ROIs as an interactive means of querying to answer this question can be experimented with using these same three layers. Try this as an exercise using the same general steps outlined above, but modified as follows. First make a PLANDS ROI as outlined in the detailed example above. Now make a HYDROLOGY ROI by using the PLANDS ROI to select all wetlands inside the selected sections (i.e. the hydrology layer's polygons in this case). Hint: use the 'Completely Inside' area selection definition. Then use the HYDROLOGY ROI to select all the soils which contain these wetlands. Hint: use the 'Partially Inside' area selection definition. Use two more mouse clicks to expose the physical attribute table for just these special selected soil types for the specified public lands. These are the hydric soil types which may contain other unmapped intermittent wetlands which might also be worth protecting. Hints: use the Show Tables icon, and note that the soil properties are in the soils LAYER table. The defaults on the tabular view window may be set to single record viewing. Use the View Selected Element Records icon to show values for all the selected soil types. At this point a medium complex geospatial question has been answered in TNTview with just a few mouse clicks and no queries. Of course, blending saved, normal queries with ROIs will make it possible to interactively achieve much more complicated analyses.

Creating Vector Objects from ROIs. The polygon boundaries in the active ROI can be inserted into any editable vector object (new or existing) in the Object Editor. First create an active ROI. For example, make a new vector layer and trace into it the polygons from a raster layer. Then select all or part of these polygons to become the active ROI. Switch to edit the vector layer you want the ROI added to. Push the Add Region icon in the Edit Tools dialog box, and the active ROI will be inserted into this vector layer and the topology reconciled as usual.

Creating ROIs from Vector Objects. All the polygons in any vector object can be immediately used to directly define a ROI. When the Region Edit Controls window is open, an Add icon is provided. Choosing it will expose the File/Object Selection window. It is normally used to locate and select a saved ROI object. However, it can also be used to navigate to, and select, any vector object. This vector object's name will immediately appear in the region list, and the aggregate area of all its polygons will become the active region. This ROI can then be used just like any other active ROI, including saving it.

Lost ROIs. When two objects are selected for use in the Display process which are widely geographically separated, they produce a blank view window with tiny separated layers which may even disappear as white dots at opposite corners of the view. In other words, the normal view will automatically rescale and reposition to accommodate these widely separate, non-overlapping objects. This will not happen if a ROI is selected which is widely geographically separated from the current layers being viewed.
The view is not rescaled, zoomed, moved, or otherwise altered if the ROI does not overlap any of the layers used in it. This is because nothing can be done or selected with this ROI. Another lost ROI effect can occur when the ROI has a polygon which encompasses and contains all the current view. This is easy to do when zoomed way into the view. In this case, the ROI boundary is outside the view and nothing of it will show. However, if this ROI is used to select elements from the view, many or all of them will be selected. An example of this big region effect is easily constructed. Display the Crow Butte vector object and zoom into it 4X. Then select the Crow Butte public lands object as the active ROI. Nothing will show of this ROI, as it is the outer boundary of all the complete sections (polygons) making up the same map area. Using this active ROI to select soil polygons which are 'Completely Inside' will select almost all of them.

Polygon Fitting. The only non-polygon based ROI creation tool in V5.50 uses polygon fitting to derive areas. It can be applied to a collection of points selected in a vector layer. Selecting points will make the Polygon Fit icon active in the new icon line in the Element Selection panel. Push it, and a new panel will be exposed to allow the selection of the particular polygon fitting method and to set its parameters. The OK button will fit the polygons and immediately create the active ROI. For the time being, the Select button in the Region Edit Controls window must be pushed to see the polygons created. This feature was inserted at the last minute and is not easily applied. For example, each polygon fit action will result in an active ROI. The fit should only generate a prototype ROI for immediate replacement if the fit type or control parameters are changed. It is also inconvenient to use since the default settings and fit method are not retained.
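One standard point-based fit is the convex hull, the smallest convex polygon enclosing all the selected points. A minimal sketch with the Python shapely library (the hull is used here purely as an illustration; V5.50 offers a choice of fitting methods and parameters):

    from shapely.geometry import MultiPoint

    def fit_region_to_points(points):
        # points: [(x, y), ...] selected point elements, e.g. animal
        # sightings for a home-range style application
        return MultiPoint(points).convex_hull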
Future Plans. The new icon line in the Element Selection panel used to save regions in the Object Editor process (Generate Regions) has three icons showing some future plans. The first one is almost always on in V5.50 and forces the active ROI to be defined by the currently Selected Polygons. The middle icon will allow a prototype ROI to be computed with the polygons fitted around a selected set of individual point elements (home ranges). The third, always gray, inactive icon will allow the creation of a prototype ROI from the area of the buffer zones computed for the currently selected points, lines, or polygons. It will expose a dropdown panel to set the buffer zone parameters. At any point during their design, the buffer zones can be converted to the active ROI by an OK button, which will also close this special panel. These and other kinds of area definition methods will interactively compute prototype ROIs which can then be immediately applied. ROIs can also be derived by the application of the auto-bounds functions to raster objects. Feature mapping via auto-bounds can already produce a series of vector objects of identified surface materials. These "material type" vector objects can already be immediately used as ROIs. New "region growing" functions are being designed for rasters to support interactive area selection of regions from rasters. Finding view-sheds, areas-of-view, and flood filling are all examples of raster region growing processes which could be modified to interactively define prototype ROIs. Many other advanced applications are possible, but the challenge is to create a manageable, understandable user interface and maintain the attached attribute tables.
MicroStation. Quarterly activity to correct errors is usually not reported here. However, this alteration to correct a "perceived" error is worthy of special note because "it was not our error". For several quarters there have been complaints about the import process for MicroStation DGN files. Careful checking of the DGN format documentation indicated nothing was wrong with our process, and many clients' files did work correctly while others reported consistent failure. Eventually enough of these requests and sample cases accumulated to show the pattern that they were all for the southern hemisphere. This enabled MicroImages to locate the problem as an undocumented negative coordinate reference, which has been figured out and handled in the import and export of DGN files. It is not possible upon import to recognize this special condition from anything provided in the DGN file header information. It is thus necessary to tell the Import process whether the DGN file uses signed or unsigned coordinates. This condition cannot be tested for in the coordinate data, as there might be only positive coordinates in a signed DGN file. Thus, a new toggle button is now available next to the description Allow Signed Coordinates (toggle if data illegible). If it is not clear which mode to select, leave the default selected (i.e. unsigned coordinates). This will ensure that the unsigned DGN format is correctly handled. However, if this gives a garbled CAD object, repeat the import process with the new signed coordinate selection.

A company can declare its formats to be public, and then be sloppy about documenting their features. Or, it is also possible that older, earlier documentation is being used. However, in this case, other software has been encountering undocumented features in DGN, as indicated in a recent incoming support request from Australia. "Has the problem with importing DGN files been resolved? With the batch of DGN files that I had last time, I found the only system readily available which would read these files was arcinfo. All other systems which claimed to read them had the same problem as Mips." This is the problem in TNTmips import which was fixed with the new signed/unsigned option. This Australian client has subsequently reported that all the DGN files now import into TNTmips without problems. MicroImages would like to note that our import/export procedures reflect 10 years of experience. However, more and more of the problems in using them are due to the use of undocumented features in the test files provided to us. Usually these come from third party software products which claim to create the geodata in some standard format. However, as in this case, the originator of the format has "hidden" or added undocumented features in what is stated to be a public format.
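The ambiguity is easy to demonstrate. In the Python sketch below (for illustration only; real DGN files store coordinates in a split-word byte order which is deliberately ignored here), the same four bytes decode to either a reasonable southern-hemisphere coordinate or a huge positive one, and nothing in the bytes themselves says which reading is intended:

    import struct

    raw = struct.pack("<i", -3500000)          # a southern-hemisphere offset

    as_signed = struct.unpack("<i", raw)[0]    # -3500000 (correct)
    as_unsigned = struct.unpack("<I", raw)[0]  # 4291467296 (garbled, but legal)

This is why the choice has to be supplied by the operator through the new toggle rather than detected from the file.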
DLG Export. Questions have been raised regarding TNTmips export of more than 500 vertices in a line into the USGS's DLG
format. DLG files which have lines (arcs) made up of more than 500
vertices cannot be imported into PC ARC/INFO. This is a
significant and easily exceeded limit in long lines (arcs) which are digitized
in stream mode. It is not a limit in the DLG specification or TNTmips
which have no limit to the vertices in a line! This has recently been confirmed
for the DLG format by direct contact with USGS. The 500 point
limit is imposed by PC ARC/INFO, and this is clearly stated in
its manual. There is also a limit of 5000 lines (arcs) per polygon. From
less specific information, it appears that these same limits apply in
the workstation versions 6.xx of ARC/INFO as well. It is not clear at this time
if these limits still exist in the 7.xx versions of the workstation and NT
ARC/INFO products. LATEST FLASH: ESRI has just acknowledged to a mutual
client that this 500-vertex limit exists in PC ARC/INFO. They have
indicated that the product will be modified to lift it to 10,000 points. No
further information was provided regarding this and the related limits in
ARC/INFO 6.xx or 7.xx. Somehow this would seem to lead to the logical conclusion
that PC ARC/INFO has seldom been used for any robust GIS activity
if this limit is only now being addressed. MicroImages has also obtained a
software package for the Sun workstation from USGS called PROSIX which
they use to verify the incoming topology of ESRI files prior to their
conversion to the DLG format.
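Until that limit is lifted, one workaround when exporting for PC ARC/INFO is to split long lines before export. A minimal sketch (Python, illustration only; this is not a TNTmips feature) that chains a long line into arcs of at most 500 vertices, repeating each break vertex so node connectivity is preserved:

    def split_arc(vertices, max_verts=500):
        # vertices: [(x, y), ...] for one long line (arc)
        # returns a list of shorter arcs that share their end vertices
        arcs = []
        i = 0
        while i < len(vertices) - 1:
            arcs.append(vertices[i:i + max_verts])
            i += max_verts - 1
        return arcs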
Coverage File Import. Workstation (and probably NT and PC) ARC/INFO version 7.0x made minor modifications to the coverage files. These changes have been adjusted for, so that the most current and previous versions of the coverage file can be imported.

Vector Products Format. The VPF format used for the Digital Chart of the World can now be imported on the Sun, SGI, DEC, and IBM UNIX based workstations.

* Vector Batch Export. The export of vector objects can now also be controlled by command-line parameters, in a fashion similar to that provided previously for raster export. This allows the export of vector objects via a batch file. Similarly, an SML process which creates a vector object(s) can create it in another vector format by including the commands to export it.

AtlasGIS's Files. The export of CAD objects to the *.BNA format has been perfected.

* GeoTIFF. The GeoTIFF standard for storing georeferenced information is now supported during import and export of TIFF rasters. Due to the extensive list of projections and coordinate systems in this standard, not all are currently available. If you are working with GeoTIFF data in a projection not currently supported by TNTmips, please contact MicroImages software support and provide sample file(s).

* Subsetting Raster Files. TNTlite can now extract raster objects of any size up to 512 by 512 cells from the PCX, GIF, and GRASS formats. The following is the new composite list of raster formats which can be subset during import. TNTmips Pro users can use these added capabilities to extract any sized subset during import.
These raster formats cannot be subset during import. This feature will be added for these imports as needed and as time allows.
Modifications Since V5.50 CDs. Import and export processes are being added for the ERDAS Imagine 8.x project file *.IMG format. This new format seems to closely imitate that of the TNT products' Project File in using tiling and pyramiding of its raster objects (see enclosed ERDAS letter to the editor of GIS World). It is even being referred to as a project file. Of course, their project files do not contain topological vector, CAD, TIN, and database objects. MapInfo's MIF/MID datasets can now be imported into and exported from a vector object. Topology is built as part of the import process.

The watershed process now generates a vector polygon boundary for the standard basins. Standard vector flow paths can also be saved as vector lines. A vector trace of flow accumulations is also computed. During this procedure outlets, inlets, branches, and basins can be defined. These use threshold settings, which are the number of cells flowing into a cell of the accumulation raster.

* Surface Modeling. (prototype process)

What is it? Earlier versions of TNTmips provided surface fitting, contouring, TIN formation, and related surface modeling procedures scattered around as several separate processes. The creation and use of these separate and independent processes over the past couple of years has gradually focused clients and MicroImages on the importance of further software development in this direction. More and more projects involve the use of DEM-like geodata for orthoimage production, surface measurements, mineral exploration, terrain analysis and trafficability, physiographic inputs to image classification and interpretation, 3-D and stereo visualization, and so on. V5.50 brings all these miscellaneous processes together into a single major Surface Modeling process where they are more easily located, used, understood, maintained, and expanded. The Surface Modeling process is almost all new code, and thus uses the latest up-to-date TNT user interface components. All these operational changes cannot be detailed here. The input parameters required in several procedures have not been markedly changed, to temporarily retain some continuity with the use of the older processes. Please experiment with the new process, and consult the supplemental documentation entitled Surface Modeling for more operational details. A black-and-white plate, also entitled Surface Modeling, is attached to illustrate a specific application of the new process.

The following surface fitting and analysis processes have been integrated into the Surface Modeling process and will be deleted in V5.60: Surface Fitting (i.e. minimum curvature, profiling, Kriging, etc.); TIN to Vector (i.e. contouring); Raster to Vector (i.e. contouring); and Vector to TIN (i.e. triangulation). The Extract Important Points process will also be moved into the Surface Fitting procedure of the new process in V5.60. As part of the conversion and reorganization of these old processes, a variety of new capabilities have already been added, and more are planned as this process evolves. These new features are introduced below. V5.50 still contains all the older processes on the menus. All of these will be removed in the release of V5.60. Thus, any features in these separate, older processes which have been omitted from this new, single Surface Modeling process should be brought to the attention of MicroImages as soon as possible so that they will be available in V5.60.

Miscellaneous New Features. More Input Object Types.
* Surface Modeling. (prototype process) What is it? Earlier versions of TNTmips provided surface fitting, contouring, TIN formation, and related surface modeling procedures scattered around as several separate processes. The creation and use of these separate and independent processes over the past couple of years has gradually focused clients and MicroImages on the importance of further software development in this direction. More and more projects involve the use of DEM-like geodata for orthoimage production, surface measurements, mineral exploration, terrain analysis and trafficability, physiographic inputs to image classification and interpretation, 3-D and stereo visualization, and so on. V5.50 brings all these miscellaneous processes together into a single major Surface Modeling process where they are more easily located, used, understood, maintained, and expanded. The Surface Modeling process is almost all new code, and thus uses the latest TNT user interface components. All these operational changes cannot be detailed here. The input parameters required in several procedures have not been markedly changed, to temporarily retain some continuity with the use of the older processes. Please experiment with the new process, and consult the supplemental documentation entitled Surface Modeling for more operational details. A black and white plate, also entitled Surface Modeling, is attached to illustrate a specific application of the new process.

The following surface fitting and analysis processes have been integrated into the Surface Modeling process and will be deleted in V5.60: Surface Fitting (i.e. minimum curvature, profiling, Kriging, etc.); TIN to Vector (i.e. contouring); Raster to Vector (i.e. contouring); and Vector to TIN (i.e. triangulation). The Extract Important Points process will also be moved into the Surface Modeling process in V5.60. As part of the conversion and reorganization of these old processes, a variety of new capabilities have already been added, and more are planned as this process evolves. These new features are introduced below. V5.50 still contains all the older processes on the menus. All of these will be removed in the release of V5.60. Thus, any features in these separate, older processes which have been omitted from this new, single Surface Modeling process should be brought to the attention of MicroImages as soon as possible so that they will be available in V5.60.

Miscellaneous New Features. More Input Object Types. The old surface fitting procedures were limited to operating on a vector object. Some accepted a TIN object as input. The Surface Modeling process now also accepts TIN and database objects as input to every surface fitting procedure. Element and value selection by query is now available for use with any surface fitting procedure on any type of input object (vector, TIN, or database). For example, a query of a database object can select a subset of X-Y points and the field for the Z value; a surface can then be fitted and saved as a raster object.

Duplicate Points. The handling of duplicate position points, previously available only in the Kriging procedure, is now available in the other procedures. Duplicate point handling helps to resolve situations where a number of different input elements (points, lines, nodes) correspond to the same cell in the output raster (depending on the size/resolution of the output raster).

Querying a TIN. The procedures which produce a contour map in a vector object now accept either TIN or raster objects as input. When a TIN object is input, a query can be used to select triangles and the field containing the value to be used for the TIN nodes. Contours are only created for the selected triangles. Thus, triangles which are not selected will produce holes and gaps in the contour vector object. An example will clarify the use of this feature. Computer generated DEMs (e.g. from RADAR interferometry) produce a slight variation in the elevation of the margins of lakes. This can produce a TIN of the general surface in which the triangles inside a lake's margin are almost flat but have a tiny slope not equal to zero. The attributes for the edges in a TIN include the identification of the adjacent triangles, which can be used in a query to test for adjacency. The area attributes contain a field for the slope of each triangle. Thus, a query could be written which finds triangles with zero or very small slopes, checks to see if at least five occur together and exceed some total area, and then eliminates them from the contouring process. If this query correctly identifies the lakes as holes, their interiors will not be contoured.

Faster Contours. When contouring a raster object, an input cell sampling rate can be specified. Use this option when interactively testing the production of a contour vector object from a huge raster object. Apply raster sampling to quickly generate a contour map without paying much attention to fine surface detail. Experiment to find the parameters that produce the desired contours. Then decrease the sampling rate or eliminate it for a final run. Think of this as something like using a wire frame for positional design in the 3-D process. Raster contouring with or without sampling is now much faster. In test situations, a 25 minute run in V5.40 became a one minute run in V5.50. This turns days into hours for those clients who report that they are contouring huge rasters (e.g. the area of a whole country).

Logarithmic Contours. A new feature is available to specify a logarithmic interval between contours. Apply this option to produce a contour object for an area which involves large areas of relatively flat terrain and local areas of steep and high features. This kind of contour rendering is commonly used in atlases where mountain ranges make equal interval contours meaningless when they stack up in the steep, higher altitudes.
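A minimal sketch in Python (not the TNT implementation; the function name is invented for illustration) of generating such a logarithmic series of contour levels:

    # Minimal sketch (not TNTmips code): logarithmically spaced contour
    # levels between a minimum and maximum elevation. Intervals are tight
    # over the low, flat terrain and wide in the steep, high terrain.
    import numpy as np

    def log_contour_levels(z_min, z_max, n):
        """Return n contour levels spaced logarithmically from z_min to z_max.
        z_min must be positive; shift the data first if it is not."""
        return np.geomspace(z_min, z_max, n)

    print(log_contour_levels(10.0, 2560.0, 9))
    # [10, 20, 40, 80, 160, 320, 640, 1280, 2560]: each interval doubles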
Object Info. The new process uses tab panels. Input and Output panels are included to present the technical properties of the input and output objects (name, description, data type, number of elements, type of georeference, etc.). Exposing a panel permits an immediate review of the objects' characteristics and avoids mistakes in setting up the process's control parameters. These characteristics, and the information which the TNT products create, maintain, and update for a geodata object, are the same as can also be reviewed in the Project File Maintenance process. They differ from metadata, much of which is manually entered. Obviously, this same descriptive panels concept needs to migrate into other TNT processes.

DEM to TIN Conversion. Wavelet Point Extraction. The procedure used in the Surface Modeling process to convert a raster object such as a DEM into a TIN object uses a wavelet transformation approach. It extracts a collection of points from the DEM that will form a TIN object which resembles the input raster surface in a "semi-optimal" way. Optional input parameters are available to control the level of detail required in the output TIN (i.e. the number of nodes, and therefore the density of the triangles). Further development of this method, which is also used in the 3-D process, is continuing in order to improve speed and accuracy control. The separate Extract Important Points process available elsewhere in TNTmips derives its points from the local maxima and minima of the DEM. This has a number of limitations, such as the presence of extreme points which are detected and treated like minimum or maximum cells. If the input raster surface is locally smooth (with just a few min/max type cells), the output from the Extract Important Points process will fail to describe it. V5.60 will allow the creation of a vector object containing important points by either method mentioned above, and other methods and logic can be added as needed (e.g. create a vector object from the cells of maximum local slope, and so on).

TIN Tolerances. The following control parameters are available in all TIN based processes (a minimal sketch of the first two rules follows at the end of this section). Elevation Tolerance. When a TIN object is being computed in the wavelet optimization procedure, an optional elevation tolerance can be set. It specifies the maximum difference between two elevations in the input data (i.e. points) which will be treated as insignificant. This tolerance will delete nodes from the TIN object on the following basis: if every elevation difference between a given node and its connected neighbors is less than the specified elevation tolerance, then the given node will be removed and the TIN recomputed in this area. For example, every cell in the input raster, field in the database object, etc. has some measurement error. When an elevation tolerance is selected with regard to this error value, the surface will be the same when all such nodes are removed. The surface gains nothing from retaining these nodes in the TIN structure.

Minimum Edge Length. DEMs with 'drawn' lines and distinct edges could produce anomalous 'sliver triangles' during the wavelet optimization process. For example, sharp lines produce a local set of 'close' points which could connect to a single distant point, creating meaningless sliver triangles. Set the minimum possible triangle edge length to eliminate such sliver triangles in the TIN object. If the distance between two nodes is greater than this value, then the edge will be retained.
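A minimal sketch in Python (not the TNT algorithm; the toy TIN, names, and tolerances are invented for illustration) of the elevation tolerance and minimum edge length rules just described. A real TIN process must also retriangulate locally after each removal, which is omitted here.

    # Minimal sketch (not TNTmips code) of two TIN tolerance rules on a
    # toy TIN stored as node coordinates plus an adjacency map.
    import math

    elev = {0: 100.0, 1: 100.2, 2: 100.1, 3: 180.0}        # node -> Z
    xy   = {0: (0, 0), 1: (1, 0), 2: (0, 1), 3: (50, 50)}  # node -> X,Y
    adj  = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2}}

    def removable_by_elevation(node, tol):
        """Elevation tolerance: a node is redundant if every elevation
        difference to its connected neighbors is below tol."""
        return all(abs(elev[node] - elev[n]) < tol for n in adj[node])

    def short_edges(min_len):
        """Minimum edge length: flag edges shorter than min_len (slivers)."""
        return {(a, b) for a in adj for b in adj[a]
                if a < b and math.dist(xy[a], xy[b]) < min_len}

    print([n for n in elev if removable_by_elevation(n, tol=0.5)])  # [0]
    print(short_edges(min_len=1.1))   # the two unit-length sliver edges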
Modifications Since V5.50 CDs. Maximum Edge Length. This option is similar in form and objective to the Minimum Edge Length outlined above. All edges that have a length greater than this value will be deleted from the TIN object. As a result, the two triangles that contain such an edge will both be deleted. This tolerance rule can create multiple isolated holes or hulls inside the TIN object. This approach can deal with multiple islands, which will then have the shape of the real shoreline. Without this tolerance value set, islands can have long-edged triangles which connect them to the nearby coast and coastal plain.

Contouring Rasters. An improved and more robust raster contouring process has been added. Also, the input data from the raster can now be smoothed "on the fly" (i.e. as contoured). This option will create visually better contour lines, especially when using a poor quality, 8-bit integer DEM. An additional option has been added which will limit the length of a continuous contour. This will eliminate very small contours around a peak and similar artifacts.

A collection of 16 new spatial filters has been added. Most of these filters are those used to improve commonly available RADAR imagery. The raster spatial filtering process also has a new interface. Some of the older filters now have more control parameters so that they can be tuned according to their application. Only the 16 new filters added with V5.50 are introduced below. Supplemental printed documentation entitled Spatial Filter is also enclosed to describe these new filters in more detail. It also provides the reference used to prepare each new filter function. A black and white plate entitled Spatial Filters is also attached to illustrate the application of some of these new filters.

Grouping Filters. All filters in this process (new and old) are now divided into six classes: General, Edge Detection, Enhancement, Noise Reduction, RADAR, and Texture. This helps in selecting the desired filter type and simplifies locating filters in the expanding set. Filter classification is difficult because a number of filters could belong to several classes. For example, the filters in the RADAR class are a specialized subset of Noise Reduction filters designed to suppress the specific speckle noise found in RADAR images. However, the more general Noise Reduction and Enhancement filters can also be useful in improving the appearance of RADAR images.

General Filters. These filters include the classical Low Pass/Average, High Pass, and High Boost filters. No new filters were added to this group in V5.50. The High-pass and High-boost filters were redesigned in order to produce correct results for a filter window size greater than 3 x 3 cells.

Enhancement Filters. This group contains new filters with a variety of characteristics. Some of them can perform edge sharpening and noise reduction at the same time (depending on filter parameters). The new filters added to this group in V5.50 are as follows. Volterra/Unsharp. The Volterra/Unsharp filter is an edge-enhancement filter in which the amount of enhancement is proportional to the local image brightness. Local Contrast. The Local Contrast filter performs a locally adaptive, spatially varying contrast enhancement. The amount of enhancement varies as a function of the existing contrast in the local neighborhood, expressed as the ratio of the global mean to the local standard deviation. (See also the separate Wallis filter description below.)
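Since the Local Contrast filter's exact formula is not given here, the following Python sketch shows only one plausible form of this kind of locally adaptive enhancement (the assumed details certainly differ from the TNT implementation): the gain applied to each cell grows with the ratio of the global mean to the local standard deviation, so low contrast neighborhoods are stretched the most.

    # A plausible sketch (not the TNTmips formula) of locally adaptive,
    # spatially varying contrast enhancement driven by the ratio of the
    # global mean to the local standard deviation.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_contrast(img, size=7, strength=0.5, eps=1e-6):
        img = img.astype(float)
        local_mean = uniform_filter(img, size)
        local_var = uniform_filter(img * img, size) - local_mean ** 2
        local_std = np.sqrt(np.clip(local_var, 0, None))
        gain = 1.0 + strength * (img.mean() / (local_std + eps) - 1.0)
        out = local_mean + np.clip(gain, 0.2, 5.0) * (img - local_mean)
        return np.clip(out, 0, 255)

    rng = np.random.default_rng(0)
    flat = rng.normal(120, 4, (64, 64))             # low contrast scene
    print(local_contrast(flat).std() > flat.std())  # contrast raised: True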
LUM (Lower-Upper-Middle). The LUM filter is a nonlinear edge-enhancement filter that simultaneously suppresses image noise. CS (Comparison and Selection). The CS filter is a simple nonlinear edge-enhancement filter that also suppresses image noise (outlier values). WMMR-MED (Weighted Majority with Minimum Range - MEDian). The WMMR-MED filter is a nonlinear edge-enhancement filter that also suppresses image noise (outlier values).

Noise Reduction Filters. The filters already available in this group were Median and Modal. The new filters added to this group in V5.50 are as follows. Olympic. The Olympic filter is useful for smoothing a noisy image by eliminating extreme values. The filter is named for the system of scoring used in certain Olympic events, in which the highest and lowest scores are dropped and the remaining ones averaged (a sketch of this rule follows at the end of this filter list). MLM (Multi-Level Median). The MLM filter is designed to reduce image noise (outlier values) while preserving edges, corners, and thin line detail in the image. PM (P-Median). The P-Median filter is designed to suppress noise while preserving edge and line detail. AMPM (Adaptive Mean P-Median). The AMPM filter is a variant of the P-Median filter that is designed to provide better smoothing in uniform regions while still preserving edges and line detail.

RADAR Filters Added. The new filters added to this group in V5.50 are as follows. Sigma. This is an improved version of the "Sigma" filter in which the parameters are adjustable. The Sigma filter can suppress speckle with minimal blurring of edges and fine detail. Frost. The Frost filter is an adaptive RADAR filter that smoothes homogeneous areas but, in heterogeneous areas, preserves a signal estimate closer to the observed value of the center cell. Lee. The Lee filter smoothes the most in uniform areas, while edges and other fine detail are maintained. Kuan (adaptive noise smoothing). The Kuan filter is similar to the Lee filter but makes fewer simplifying assumptions in the calculations.

Texture Filters Added. Standard Deviation was the only filter previously available in this group. The new filters added in V5.50 are as follows. Teager. The Teager filter produces an image of the edges within the original scene. Range. The Range filter produces an image of one of the simplest elements of texture, the local range of values.
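The Olympic filter's scoring rule is simple enough to sketch directly. The following Python fragment (not the TNT implementation; the window size and padding choices are ours) applies it with a 3 x 3 moving window:

    # Minimal sketch (not TNTmips code) of the Olympic filter: in each
    # window, drop the highest and lowest values and average the rest.
    import numpy as np

    def olympic(img, size=3):
        pad = size // 2
        padded = np.pad(img.astype(float), pad, mode="edge")
        out = np.empty(img.shape)
        for r in range(img.shape[0]):
            for c in range(img.shape[1]):
                w = np.sort(padded[r:r + size, c:c + size].ravel())
                out[r, c] = w[1:-1].mean()   # drop one min and one max
        return out

    img = np.array([[10, 10, 10],
                    [10, 255, 10],   # single-cell outlier (e.g. speckle)
                    [10, 10, 10]])
    print(olympic(img)[1, 1])        # outlier suppressed: prints 10.0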
The Wallis filter was developed separately for V5.50 and occurs in its own menu location at Interpret/Raster/Filter/Wallis... Used properly, this is an important and useful filter, as is obvious from the attached black and white plate entitled Wallis Filter, Locally Adaptive Contrast Enhancement. It works its way through an image to produce a result that has much more spatial detail available to the human eye in the darker and lighter areas.

How Does It Work? The Wallis filter performs a locally adaptive, spatially varying contrast enhancement. The brightness values in local areas are adjusted so that the local mean and standard deviation match desired output values as closely as possible. This enhancement produces good local contrast throughout the image, while reducing the overall contrast between bright and dark areas. The Wallis filter partitions the input raster into contiguous square blocks of user defined size. The local mean and standard deviation are calculated for each block, and the resulting values are used to calculate a gain coefficient for the central cell in the block. Gain coefficients for other cells are calculated from the block values by bilinear interpolation. The image partitioning and interpolation of gain coefficients employed by the Wallis algorithm result in rapid processing of the input raster when compared to a cell-by-cell calculation of local statistics. The output raster values are a user-controlled weighted average of the Wallis filter output value and the original raster value. A simplified sketch of this scheme follows at the end of this section.

What is Its Application? Dark and light areas of images, especially digitally recorded images, often have a range of interesting spatial features at different data levels which are not easily seen. This may be due to the low dynamic range of the display media at these extremes (e.g. in a graytone print on a laser printer), the "fall-off" in sensitivity of the human eye, and many other factors. The Wallis and Local Contrast filters (described above) work through an image, detect areas of extreme contrast, and remap these local areas' cell values back into the middle ranges of the cell values in the total image. This provides a pleasing image that is more easily interpreted. However, the new raster produced now contains locally altered data values (as with all filters). This raster should not be subjected to automated computer interpretation (with the possible exception of Feature Mapping). Geologists often use these enhanced images for the increased visual interpretability they provide. They are not that concerned that surface features in one area of the image match in tone or color the same features occurring elsewhere in the image. After all, they are seeking geologic features already obscured by many other factors. In other words, it is detecting and mapping the features' outlines that is of most interest. Using these filters to operate on three separate images and then combining them in RGB can produce very strange results. Remember, the filter adjusts the tones in each image totally independently, and the overlap will be hard to interpret. However, a modified Wallis filter is being used successfully by some geology clients. It treats all three images at once and produces enhanced color results. MicroImages will be experimenting with this color enhancement filter for release in V5.60.
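The block statistics and bilinear interpolation scheme described under How Does It Work? can be sketched compactly. The following Python fragment is a simplified reading of that description (not MicroImages' code; the target mean/deviation, gain limit, and blending weight are invented parameters):

    # Simplified sketch (not TNTmips code) of a Wallis-style filter:
    # per-block gain/bias pull each block toward a target mean and
    # standard deviation; gains are bilinearly interpolated between
    # blocks; the result is blended with the original raster.
    import numpy as np

    def wallis(img, block=32, target_mean=127.0, target_std=40.0,
               blend=0.7, max_gain=4.0):
        img = img.astype(float)
        rows, cols = img.shape
        nbr, nbc = -(-rows // block), -(-cols // block)   # blocks per axis
        gain = np.empty((nbr, nbc)); bias = np.empty((nbr, nbc))
        for i in range(nbr):
            for j in range(nbc):
                b = img[i*block:(i+1)*block, j*block:(j+1)*block]
                g = min(target_std / (b.std() + 1e-6), max_gain)
                gain[i, j] = g
                bias[i, j] = target_mean - g * b.mean()

        def upsample(a):   # bilinear interpolation of block values
            yi = np.clip((np.arange(rows) + 0.5) / block - 0.5, 0, nbr - 1)
            xi = np.clip((np.arange(cols) + 0.5) / block - 0.5, 0, nbc - 1)
            y0 = np.floor(yi).astype(int); x0 = np.floor(xi).astype(int)
            y1 = np.minimum(y0 + 1, nbr - 1); x1 = np.minimum(x0 + 1, nbc - 1)
            fy = (yi - y0)[:, None]; fx = (xi - x0)[None, :]
            return (a[np.ix_(y0, x0)]*(1-fy)*(1-fx) + a[np.ix_(y0, x1)]*(1-fy)*fx
                    + a[np.ix_(y1, x0)]*fy*(1-fx) + a[np.ix_(y1, x1)]*fy*fx)

        enhanced = upsample(gain) * img + upsample(bias)
        return np.clip(blend * enhanced + (1 - blend) * img, 0, 255)

    rng = np.random.default_rng(1)
    dark = rng.normal(30, 5, (128, 128))      # detail hidden in a dark area
    print(wallis(dark).mean() > dark.mean())  # pulled toward mid-range: True

Note how the per-block statistics plus interpolation avoid a full cell-by-cell computation of local statistics, which is the speed advantage mentioned above.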
* Surface Properties. (prototype process) Introduction. A new process is available at Interpret/Vector/Surface Properties to compute three new tables of the surface properties of a raster object (e.g. a DEM or image). These tables contain the three dimensional properties of the line elements and of the areas inside the polygon elements in an overlapping vector object. The vector and raster objects used in this process do not need to have a common map projection, scale, or area of extent. Use this approach when a vector object is available (e.g. a crop type map) and the surface properties (e.g. actual surface area) are needed for the lines and polygons in an entire vector object. The process also supports the direct drawing of a series of new polygons for which these three new tables can be immediately created. These new polygons are outlined sequentially with the standard polygon drawing tools. A new vector object is then created containing these polygons and the attached tables of surface properties. Use this method when a few new polygons are to be created (e.g. specific individual crop fields) and these surface properties immediately computed and used.

Reference layers can also be selected and displayed in the process. This allows the direct drawing method to be used in a direct photointerpretation mode. For example, first choose an elevation raster to define the surface layer. The Crow Butte quad sample elevation raster can be used for a trial run. Then choose a reference layer which is an image. The color-infrared rendition of the LANDSAT TM imagery of the Crow Butte quad will work. This reference image will replace the elevation surface in the view window; however, the surface properties will be computed from the designated elevation surface, which is no longer visible. While viewing the reference image, use the polygon tool to outline several polygons (e.g. crop fields). Finally, choose Run to compute the surface properties tables. The process will prompt for the name, location, and description of the new vector object for these polygons and for the name and description of the new tables.

Computational Methods. Volumes are computed using a piecewise integral under the portions of the surface above and below a specified horizontal reference plane. During this process, the cells inside the 3-D trace and the cells making up the trace are evaluated. A square horizontal cell on the surface is projected to a square cell on the horizontal plane passing through the selected reference value. Connecting the corners of these two square cells defines a square orthogonal prism. Every surface cell inside the 3-D trace, and every 3-D trace cell, defines such a prism. If a prism is above the reference plane, its volume is positive; if below, it is negative. The various volume properties are the sums of these negative and positive prism volumes. Many other volumes can be computed by combining the results of various Z-Level settings and the three volume values provided in each table and defined below.
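A minimal Python sketch (not the TNT implementation; the toy DEM and mask are invented) of this prism summation for the three volume fields defined later in this section:

    # Minimal sketch (not TNTmips code) of the prism-sum volume method:
    # each surface cell inside or making up the 3-D trace contributes a
    # square prism between the surface and the Z-Level reference plane.
    import numpy as np

    def prism_volumes(surface, mask, z_level, cell_area=1.0):
        """Sum prism volumes for the cells where mask is True."""
        z = surface[mask].astype(float)
        heights = (z - z_level) * cell_area
        volume_pos = heights[heights > 0].sum()
        volume_neg = heights[heights < 0].sum()   # reported as negative
        volume_zmin = ((z - z.min()) * cell_area).sum()
        return volume_pos, volume_neg, volume_zmin

    dem = np.array([[100., 105., 100.],
                    [105., 120., 105.],
                    [100., 105., 100.]])
    mask = np.ones(dem.shape, dtype=bool)           # whole 3 by 3 trace
    print(prism_volumes(dem, mask, z_level=104.0))  # (20.0, -16.0, 40.0)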
Vector polygons and line elements may or may not have Z values associated with them, depending upon their source and previous use. This process does not use the Z values for the elements even if they are available. Each vector polygon or line element is projected vertically onto the surface raster to obtain a 3-D trace. This 3-D trace is a continuous string of raster cells defining a closed raster polygon or a line. Each vector polygon or line element is made up of a string of vertices connected by straight line segments. The raster cells intersected by the line segments between vertices, together with the cells actually containing the vertices (and only these cells), define the 3-D trace. The X, Y, and Z values of the cells in a 3-D trace are used to define the line, area, and volume properties. The individual cells making up the 3-D trace are also included within the area and volume properties. A black and white plate entitled Surface Properties is enclosed to illustrate the interface of the process and the contents of the three new tables it creates. Supplemental documentation entitled Surface Properties is also enclosed.

Line Properties Table. The following are the fields which are created in the lines surface properties table, together with the methodology used to compute the values in them. Max. This is the cell value of the maximum or highest value surface cell in the 3-D line trace. Min. This is the cell value of the minimum or lowest value surface cell included in the 3-D line trace. MaxSlope. This is the maximum slope found between the cell values of two adjacent cells in the 3-D trace of the line on the surface. This is a very "local" property of the 3-D trace and is thus significantly impacted by "noise" (i.e. cell to cell inaccuracies) in the surface raster. MinSlope. This is the minimum slope found between two adjacent cells in the 3-D trace of the line on the surface. This is a very "local" property of the 3-D trace and is thus significantly impacted by "noise" (i.e. cell to cell inaccuracies) in the surface raster. Length. This is the length of the 3-D line trace on the surface. It is computed as the sum of the lengths of the 3-D line segments connecting the centers of the cells making up the 3-D trace. It is always greater than, or nearly equal to, the length of the vector line which is available in the standard attributes table. Even for a horizontal surface, these two lengths will not exactly match, due to the different methods used to compute them.
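A minimal Python sketch (not the TNT implementation; the cell size and toy trace are invented) of this summation of 3-D segment lengths:

    # Minimal sketch (not TNTmips code) of the Length field: sum the 3-D
    # segment lengths between the centers of consecutive trace cells.
    import math

    def trace_length_3d(cells, cell_size=10.0):
        """cells: ordered (row, col, z) tuples for the cells of a 3-D trace."""
        total = 0.0
        for (r0, c0, z0), (r1, c1, z1) in zip(cells, cells[1:]):
            dx = (c1 - c0) * cell_size
            dy = (r1 - r0) * cell_size
            total += math.sqrt(dx*dx + dy*dy + (z1 - z0)**2)
        return total

    trace = [(0, 0, 100.0), (0, 1, 104.0), (0, 2, 110.0)]
    print(trace_length_3d(trace))                               # about 22.4
    print(trace_length_3d([(r, c, 0.0) for r, c, _ in trace]))  # 20.0 projected

The sloped 3-D length (about 22.4 here) always meets or exceeds the projected 2-D length (20.0), which is why the table value can only equal or exceed the standard attribute length.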
Polygon Properties Table. The following are the fields which are created in the polygon surface properties table, together with the methodology used to compute the values in them. Z-Level. This is the vertical offset which was entered in the Surface Properties dialog box. It specifies a horizontal reference plane at a positive or negative offset from the 0.0 value in the surface raster object. The positive and negative volume properties of the surface are measured relative to this plane, as described below. Area. This is the area of the surface inside the actual 3-D trace of the polygon on the surface. Bilinear interpolation is used to model the surface to compute this area. It is always greater than, or nearly equal to, the projected area of the vector polygon which is available in the standard attributes table. Even for a horizontal surface inside the polygon, these two areas will not exactly match due to the different methods used to compute them. MaxZ. This is the cell value of the maximum or highest value surface cell included within (and including) the cells in the 3-D polygon trace. It is measured positive or negative relative to 0.0 in the raster object used for the surface. This is the same value as Max in the table of raster surface properties. MinZ. This is the cell value of the minimum or lowest value surface cell included within (and including) the cells in the 3-D polygon trace. It is measured positive or negative relative to 0.0 in the raster object used for the surface. This is the same value as Min in the table of raster surface properties. BoundMaxZ. This is the maximum cell value found for the cells in the 3-D trace of the polygon on the surface. It is measured positive or negative relative to 0.0 in the raster object used for the surface. BoundMinZ. This is the minimum cell value found for the cells in the 3-D trace of the polygon on the surface. It is measured positive or negative relative to 0.0 in the raster object used for the surface. MaxBoundSlope. This is the maximum slope found between the cell values of two adjacent cells in the 3-D trace of the polygon on the surface. This is a very "local" property of the 3-D trace and is thus significantly impacted by "noise" (i.e. cell to cell inaccuracies) in the surface raster. MinBoundSlope. This is the minimum slope found between two adjacent cells in the 3-D trace of the polygon on the surface. This is a very "local" property of the 3-D trace and is thus significantly impacted by "noise" (i.e. cell to cell inaccuracies) in the surface raster. BoundLength. This is the length of the perimeter of the 3-D trace. It is computed as the sum of the lengths of the 3-D line segments connecting the centers of the cells making up the 3-D trace. It is always greater than, or nearly equal to, the projected perimeter of the vector polygon which is available in the standard attributes table. Even for a horizontal surface inside the polygon, these two lengths will not exactly match due to the different methods used to compute them.

VolumeZmin. This is the piecewise integral of the volume under the surface and above the minimum for the surface cells inside or making up the 3-D trace. The cells making up the 3-D trace are included in the volume. In other words, this is the sum of all the square orthogonal prisms above the minimum 3-D trace cell. All these prisms are positive except the prism for the Min cell, which has a zero volume. VolumePos. This is the piecewise integral of the volume under the surface and above the Z-Level reference plane. It sums only the positive values for the surface cells inside of and also making up the 3-D trace. In other words, this is the sum of all the positive prisms above the Z-Level plane and inside or making up the 3-D trace. VolumeNeg. This is the piecewise integral of the volume outside the surface and below the Z-Level reference plane, but inside or making up the 3-D trace. In other words, this is the sum of all the negative prisms below the Z-Level plane and inside or making up the 3-D trace.

Sample Applications. An example might serve to indicate the use of these volume properties. Suppose that the volume of a portion of a mine spoil pile (i.e. a hill) is required above a specified elevation of 110 meters. A digital elevation raster is available of the area including the spoil pile, which has a maximum height of 120 meters (Max). An X-Y polygon can be traced around the base of the pile, which is at about 90 meters (BoundMinZ and BoundMaxZ are 89 and 91 meters). The volume of the pile above 110 meters will be found in VolumePos when the Z-Level has been set to 110 meters. Should any interior irregular portion (i.e. a hole) of the pile dip below 110 meters, it will not be included in VolumePos. Next assume that a nearby open pit mine created part of this spoil. How much will the volume above 110 meters contribute to filling in this pit?
Trace an X-Y polygon around the pit, which also varies from 88 to 92 meters. Set the Z-Level at 88 and look at the VolumeNeg value required to fill to that level. Or set the Z-Level at 90 and get a more accurate value to grade the surface back to the 88 to 92 meter average elevation. Now get more complicated by computing two surface properties tables for the areas of polygons between two elevations. Set each desired elevation as the Z-Level in two separate runs. When this is complete, use the "computed field" procedures to prepare new fields which contain the appropriate algebraic sums of the corresponding volume properties in each table.

Raster Properties Table. The following are the fields which are created in the raster statistical properties table, together with the methodology used to compute the values in them. IMPORTANT: Any raster can be used to define a "surface". Selecting an image as a surface will create this raster properties table from vector polygons. Selecting several images from a multispectral or multitemporal set will create a suite of matching raster properties tables. These tables can then be exported and merged in a separate database package, moved into a spreadsheet, and so on for further modeling and analysis. Max. This is the cell value of the maximum or highest value raster cell included within (and including) the cells in the 3-D polygon trace. This is the same value as MaxZ in the table of polygon surface properties. Min. This is the cell value of the minimum or lowest value raster cell included within (and including) the cells in the 3-D polygon trace. This is the same value as MinZ in the table of polygon surface properties. Mean. This is the mean of all the raster cells included within (and including) the cells in the 3-D polygon trace. Mode. This is the mode of all the raster cells included within (and including) the cells in the 3-D polygon trace. Median. This is the median of all the raster cells included within (and including) the cells in the 3-D polygon trace. StdDev. This is the standard deviation of all the raster cells included within (and including) the cells in the 3-D polygon trace. CellCount. This is a numeric count of all the raster cells included within (and including) the cells in the 3-D polygon trace.
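A minimal Python sketch (not the TNT implementation; the toy band and mask are invented) of computing these statistics for the cells inside and including a polygon's 3-D trace:

    # Minimal sketch (not TNTmips code) of the raster properties fields,
    # computed over the cells selected by a polygon trace mask.
    import numpy as np

    def raster_properties(raster, mask):
        vals = raster[mask].astype(float)
        uniq, counts = np.unique(vals, return_counts=True)
        return {"Max": vals.max(), "Min": vals.min(), "Mean": vals.mean(),
                "Mode": uniq[counts.argmax()], "Median": np.median(vals),
                "StdDev": vals.std(), "CellCount": int(vals.size)}

    band = np.array([[10, 12, 12],
                     [11, 12, 40],
                     [10, 11, 12]])
    inside = np.array([[1, 1, 1],
                       [1, 1, 0],   # 0 = outside the polygon trace
                       [1, 1, 1]], dtype=bool)
    print(raster_properties(band, inside))

Run once per band of a multispectral or multitemporal set to build the suite of matching tables described above.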
SML. The list of the functions in the Spatial Manipulation Language (SML) and their application references have been updated. A new 74 page Appendix A: SML Language Reference is enclosed as supplemental documentation. There is now a set of date manipulation functions. In a database query, you can now determine the number of records attached to a given element using the form SetNum(Table[*]). This will return the number of records from "Table" attached to the current element. The function DispWaitForButtonPress() has changed slightly. It takes a parameter which is the ID of the display window. This parameter is now optional. If omitted, the SML script will wait for a button press in any open display window.

Modifications Since V5.50 CDs. The SML restructuring is now well underway. Anyone wishing to create more elaborate scripts can offer to test these changes. The first goal is to let an existing script be selected and executed anywhere a layer can be selected. For example, a script can be selected as a raster layer in the display process. This script will then be executed and its resulting image displayed just as if it had been stored in a raster object.

* Improved Use of Windows Drivers. When using a Windows printer driver, it is now possible to let the driver do the dithering and color matching. After selecting the Windows printer and settings, a dialog will appear asking if you want to let Windows handle the dithering and color matching. Doing so can greatly improve the quality for some printers. All TNT products now communicate 24-bit color data to Microsoft Windows printer drivers. V5.40 used 4-bit dithered color, which resulted in poor color fidelity. This modification will significantly improve screen to printer renditions. However, it is still an industry problem to automatically create a good rendition of a screen's color image on a color printer.

Epson Stylus Pro. The regular TNT direct driver for the Epson Color Stylus models of printers was modified to put less ink on the page to avoid bleed-through. There is now a new, second driver which can be selected, named 'Epson Color Stylus (Lighter)'. With these printers, the 'Cluster2' dither pattern should be used for prints of vector features. For prints of rasters, use the 'Halftone1' dither pattern.

* Free Base Level Printing. Color printing using the Map and Poster Layout process is now available without charge for printers previously covered by printer support feature levels P3 and P5. As a result, color snapshot printing and full Map and Poster Layout printing capabilities are now available in the TNTmips, TNTview, TNTatlas, and TNTlite products. This printing without extra charge on 8.5 by 11" printers applies to all color printers of the three or four color "fixed dot size" variety. Higher level printer support options (greater than level P5) must still be purchased for variable dot size or sublimation based printers of any size.

Modifications Since V5.50 CDs. The 'Epson Color Stylus' and 'Epson Color Stylus (Lighter)' drivers in V5.50 require choosing between a good dither pattern for vectors and another for rasters. The 'Cluster2' dither pattern has been modified so that it now works well for mixed prints of rasters and vectors. It can be used with either driver. This improved dither pattern has been tested on this printer with regular paper and now does not 'bleed through' the paper. It has also been tested with the special Epson polyester based paper, and the ink now dries before it can puddle or run.

The HyperIndex navigator window used in TNTatlas has been improved as follows. A page of supplemental documentation entitled HyperIndex Navigator Update has been enclosed to describe its use.
The attached color plate, also entitled HyperIndex Navigator Update, illustrates the new approach. The directional arrow icons in the Navigator window now take you immediately to the closest link in the specified direction. The arrow icons now have ToolTips which indicate what object is in the specified direction. Previously these ToolTips gave useless generic descriptions like "Left". Clicking the right mouse button on an arrow icon pops up a menu of all links to objects in the specified direction, sorted in nearest to farthest order. Selecting an object off this menu will jump you directly to that object. The scrolled list to the right of the buttons is no longer needed and has been removed. This makes the HyperIndex Navigator window much smaller and easier to keep exposed. There is now a "Print" icon button which prints a snapshot of the current view. A "Metadata" button has been added to the layer list to allow metadata to be viewed for a selected layer if available.

MicroImages Advanced Users' Workshop (AUW9) By popular request from clients and MicroImages' staff, the 9th annual Advanced Users' Workshop (AUW9) will be delayed until late spring or early summer in Lincoln. The cold weather, proximity to the holiday season, complications of winter travel, lack of adjunct warm-weather vacation opportunities, and related difficulties are behind this change. More information on the exact dates and details for AUW9 will be forwarded at an appropriate future date.

Reproduced from GIS Asia Pacific, October 1996, page 11. Pearson Professional (Singapore) Pte Ltd, 159 Telok Ayer Street, Singapore 068614. "First-ever Japanese TNTmips User Workshop. Japanese users of MicroImages' TNTmips software met on 9 September 1996 at the Geological Survey of Japan (GSJ) for a workshop session. The event, sponsored by the local TNTmips distributor [dealer] OpenGIS Inc. (Tokyo, Japan), drew 30 participants. The theme of the workshop was raster and vector data conversion and integration. A series of presentations were given by members of the Geological Survey, a major user of TNTmips, and by Toshihiko Waza, president of OpenGIS Inc." "Waza, a vocal advocate for the fair pricing of computer software in the Japanese market, summed up the meeting thus: 'The fast and steady growth of the MIPS user base in Japan over the past two years shows that GIS and remote sensing clients in Japan desire to have functional and professional solutions at a fair and reasonable price, unlike many of the imported GIS and remote sensing software products currently being distributed here. Given the rapid expansion, over the last few years, of the DOS/V PC market base, I expect continued growth and interest in the desktop GIS and remote sensing market here in Japan.'" [MicroImages greatly appreciates Mr. Waza's vocal support of our policy that 'a fair price for all is good for all'.]

MicroImages Authorized Dealers New dealers sought--no investment required for existing clients who believe in the TNT products and the way in which MI conducts its business. Many of you are already our best sales people. Tom Gibson has recently left Kennecott Corporation (a subsidiary of RTZ International) to form his own consulting group specializing in applications of GIS and remote sensing in the mining industry. He has considerable experience in this area, having acted as Kennecott's reference person (guru) for all the RTZ North American offices for several years.
Tom has used and promoted the TNTmips products during these years and is known to many of our older clients from previous Advanced Users' Workshops. Among its varied consulting services, Gibson and Associates will be offering design, sale, and training for the TNT products as a MicroImages Authorized Dealer. Tom and Gibson and Associates can be reached at 950 East 8475 South, Sandy, Utah 84094; voice at (801)569-9059; and FAX at (801)569-8387.

VIDAR (Inc.) has decided to more closely focus its assets and activities on the medical publishing area due to current economic realities in Russia. As a result, VIDAR will not join the MicroImages Authorized Dealer program. MicroImages wishes to thank Leonid Zharikov for his company's previous efforts on behalf of our TNT products. Until a new dealer(s) is selected in Russia, all current and potential MicroImages clients' sales and support will be provided directly by MicroImages from Lincoln, NE. Such communications can be in Russian and will be answered by one of our three native Russian software engineers.

New Image Printers/Plotters Supported Epson Stylus Pro XL. A driver is now available for the Epson Stylus Pro XL. It can print in color at 180 by 180 dpi, 360 by 360 dpi, and 720 by 720 dpi. The standard model has a paper size of 8.5 by 11". A Super B model is available with paper sizes of 13 by 19" (Super B) and 11 by 17" (B). These printers can print on a variety of materials including transparencies, glossy paper, coated paper, plain, and bond. Their big drawback is as quoted directly from the Epson product specifications: "Full size 720 x 720 dpi color photographic image [for 8.5 by 11"] typically prints in about 50 minutes."

HP Deskjet 820. A special note was provided by HP in the printer driver development materials distributed for the DeskJet 820 printer. "This is the first HP DeskJet printer to be designed strictly for printing from Microsoft Windows with the HP-made DeskJet 820 printer driver. No facility exists within the DeskJet 820 printer to process and respond to PCL commands. HP has coined the term 'PPA' for this type of printer, meaning 'Printing Performance Architecture'. The HP DeskJet 820 printer driver for Windows is required for using this printer. Since the printer does not accept PCL commands, the HP DeskJet 820 printer has been excluded from the discussion in this developers guide."
This quarter, one Gateway 'best for your money' computer is recommended if you are ready to upgrade your hardware now. The recommended machine is the same as the most recent unit which MicroImages has purchased. Yes, you can now afford and should buy a Pentium Pro (P6 processor) for TNTmips. The following is quoted to support this recommendation from page 155 of PC Magazine, Vol. 15, No. 16, 24 September 1996. It matches our experiences with this unit as well. "Pentium Pro Beats Classic at 32 Bits. Unquestionably, under a 32-bit operating system such as Windows NT 3.51 with only 32-bit applications, the Pentium Pro's performance soundly trounces the Pentium's, even when both are running at 200 MHz. Using our benchmark test ZD Winstone 32, we found that the performance over the Pentium/200 we've tested increased by about 35 percent. And with only a 10 percent price gap between the two chips, you get your money's worth out of the Pentium Pro." Remember that all the TNT products have been fully 32-bit (and 64-bit for DEC Alpha via UNIX) compliant for years. Therefore, they will take full advantage of the 32-bit execution optimization features designed into the Pentium Pro processor. The slower aspects of others' 16-bit programs affect only those programs and not the TNT products. W95 also contains quite a bit of 16-bit code and runs about the same on a Pentium Pro. However, NT3.5x and NT4 use only 32-bit code and will also benefit significantly from the 32-bit optimizations built into the Pentium Pro platforms.
Sony and Toshiba have both introduced excellent multimedia oriented and fancy PCs using the Pentium 200 MHz (P5) processor. Both have recently received top ratings from reviews, including PC Magazine's test lab. Either would make an excellent system for TNTmips. Both are configured similarly to the Gateway Pentium Pro outlined above (hard drive, CD, etc.) but at present cost $2700+ each without the monitor.

GIS World review. A color reprint of the article entitled Image Processing Under Windows NT, A Comparative Review is enclosed. As the first page of this reprint, there has been appended a color cover letter entitled TNTmips is #1 at Only 1/5 the Cost! This reprint is now included in each detailed promotional package mailed from MicroImages. The article and the added cover letter speak for themselves. Also enclosed are the "excuses" letters sent to the authors of the article by the top management at Intergraph, ERDAS, and PCI. I did not find it necessary to similarly attack these authors, as the products and policies of MicroImages speak well enough for MicroImages! There is now an interesting anecdote concerning this article. The authors of the article purchased and use a UNIX based TNTmips system. To the best of our knowledge, this is the only one of the systems reviewed which has been purchased by these authors and is at this institution.

Getting Started Displaying Geospatial Data. By popular request from beginners, each TNTlite kit and professional product shipped now contains this small tutorial manualette. It has been prepared in color, but is currently copied in black and white. Color printing may be a future possibility, but it is currently uneconomical. TNTmips and TNTview are dynamic and evolving products, and this manualette will probably need to be changed in most quarters. The color version of this manualette with larger graphics can be found at microimages.com. This manualette turned out so well that the preparation of several more on various topics is underway by MicroImages' scientific writers. Watch www.microimages.com for others which become available before the shipment of V5.60. This activity will shift some effort away from the maintenance of that huge reference manual. But, we feel that using a portion of the available resources to help clients "Get Started" on various topics is in order. Once a start-up tutorial has been used, the huge on-line reference manual becomes more useful. All the data sets used in this and the other planned manualettes will be available to use in working through the tutorials. Several such sample data sets are already provided on the CDs for the V5.50 TNT products. All of the objects used in the Getting Started illustrations will be small enough to use in TNTlite. This not only makes these manualettes useful for TNTlite self-instruction, but makes their use in repeating the exercises fast, even on slower systems.

Archaeology promotional flier. A color test flier has been prepared and is enclosed to introduce the FREE TNTlite to archaeological students and professionals. This flier has just been mailed to 1000 archaeological professionals. Precision agriculture flier. A color test flier has been prepared and is enclosed to introduce the FREE TNTlite to the agriculture industry. This flier has not yet been used. Its black and white predecessor, which was sent with V5.40, has been used at several agricultural shows and mailings. ABCs of Image Analysis by TNTlite flier.
Dr. Jack Paris and Bob Wright have done a lot of additional work perfecting the content of their text entitled The ABCs of Image Analysis using TNTlite (formerly The ABCs of TNTmips). University bookstores are now being approached by Bob Wright to stock this ABC kit (text and CD) where an instructor is using TNTlite in teaching a remote sensing class. Since this text is so useful to students and professionals alike, MicroImages has invested in the preparation of the enclosed promotional flier and order coupon. This coupon will be included by MicroImages in each TNTlite kit and with each TNTlite CD promotional flier shipped. Paris and Associates and Atterbury Consultants, Inc. (both MicroImages dealers) will also be conducting their own independent promotional campaigns with this flier.

Preliminary Recipes flier. Dr. Jack Paris also has another tutorial product available entitled TNTlite Recipes. A preliminary black and white flier describing this new material is enclosed. Please contact Dr. Jack Paris for this product.

TNT Products CDs. The CDs for the TNT products now have that new, colorful Mars/Earth look.

Vector Analysis Chart. A 2-sided reference chart is included entitled TNTmips Vector Analysis Operations. It provides a colorful synopsis of some of the GIS Object-In/Object-Out procedures in TNTmips as well as where to find them. Equivalent ARC/INFO process names are also included in parentheses on this chart. Thus, it can be provided to others who are familiar with ARC/INFO operation but are still not convinced that TNTmips is at least at parity with this product in the GIS area. This chart is now included in each detailed promotional package and TNTlite kit shipped from MicroImages.

Anaglyph Glasses. Keep these cheap cardboard monochrome glasses handy. Tape them to the side of the monitor so they don't get lost. More and more stereo features will be appearing in the TNT products. These glasses may be the easiest way to judge the value of the additional equipment for electronic stereo viewing.

EASI Database CD. A simple promotional piece is enclosed to describe the EASI Database CD of United States demographic data. This CD is available from its publisher for $99. Attached to this MEMO is an example of how the database tables on this CD can be directly used in TNTmips, TNTview, and TNTlite. New sample Project Files provided in V5.50 contain county and state boundaries. Attach the EASI Database tables to these vector polygons and immediately start making theme maps, pin maps, etc. of the United States or an individual state. The illustrations of the use of the EASI tables were made with these political boundary vector objects. The tables do not even need to be copied from the EASI CD; they can be used in place by linking directly to them using the normal procedures summarized on these promotional sheets. The EASI Database CD is very useful in GIS self-teaching or in academic and training programs. It provides a large collection of United States statistics in a form which can be immediately used. For teaching purposes, one EASI Database CD can be shared so that students learn how to deal with large and meaningful collections of attributes.

The TNT products no longer check the key for the use of printers which are in the P3 and P5 printer support feature categories. Thus, all TNT products can use the Map and Poster Layout process to print in color or black and white on low cost color printers up to 8.5 by 11" which use fixed dot dithering to reproduce their colors.
Prices for the support of color printers which fall in the categories P8, P10, P15, and P20 remain unchanged. Sublimation, variable dot size, and other printers which use techniques other than fixed dot dithering to create continuous color images are in these P8 and greater categories and are thus not covered by the free P3 and P5 categories. Printer support P8 is still required to create large files to take to a service bureau.

MicroImages will soon change the policy regarding the cost of upgrades. We anticipate that the costs for annual subscriptions will not change. But it will soon only be possible to upgrade TNT products on an annual basis, that is, for four quarters. Thus, it will no longer be possible to purchase single quarter upgrades on a quarter by quarter basis for a slightly higher cost per quarter. The cost of upgrading a TNT product will also begin to rise in proportion to the age of the TNT product. In other words, if the last upgrade of the product was V4.00, then it will cost more to upgrade from it than from V4.10, and so on. This will result in significant increases in the cost of upgrading old TNT products between V4.00 and V5.00 to the current version. For example, to upgrade from V4.00 to V5.60 may cost between two and three times the current rate. We suggest that anyone considering upgrading to V5.50 from an old TNTmips contact MicroImages before this price increase goes into effect.

MicroImages will soon announce the availability of multiple simultaneous user licenses for use with systems operating Microsoft NT. However, we anticipate that the current prices of the single user licenses for the TNT products for Microsoft NT will not increase.

Books. Image Interpretation in Geology (2nd ed.). S.A. Drury. 1993. Chapman and Hall. ISBN 0 412 48880 9. This book is an excellent reference and text book for those interested in learning about or teaching the application of geo-imagery in geology. It can be obtained from:
GIS World. Image Processing Under Windows NT, A Comparative Review. by Lee A. Graham and Charles Gallion. GIS World, September 1996, Vol. 9, No. 9, pages 36 to 44.

PC Magazine. Life Beyond Windows 95: NT 4.0 & OS/2 4.0 Head for Your Desktop. by Michael J. Miller. PC Magazine, Vol. 15, No. 16, 24 September 1996, pages 101 to 111.

Windows NT 4.0. It's prettier, but there are more subtle changes that may affect performance and reliability. by Jeff Prosise. PC Magazine, Vol. 15, No. 16, 24 September 1996, pages 117 to 136.

17-Inch Monitors Widen Your Horizons. A 17-inch monitor can make you more productive, and these 49 displays are more affordable than ever. by Jamie M. Bsales. PC Magazine, Vol. 15, No. 17, 8 October 1996, pages 243 to 279.

Christopher Nelson. Chris has joined MicroImages as a graphics specialist. Chris and Lance Dyas are now jointly responsible for the "public image" of MicroImages via icons, graphics, illustrations, logos, promotional materials, and so on. Chris has an Associate's Degree in Graphic Design from Southeast Community College near Lincoln, Nebraska, and is a part time senior in the Department of Fine Arts at the University of Nebraska. It is easy to illustrate what Chris has been doing in his first couple of months at MicroImages. He created the graphics for the archaeology promotional flier, the cover letter for the GIS World reprint, and the book illustrations for the flier for the ABCs kit enclosed with the V5.50 shipment.

Jeffrey L. Chester. Jeff has joined MicroImages' software support team. Jeff completed his BS degree in History in 1985 at the University of Nebraska at Omaha. For the 11 years since graduation, Jeff was in the U.S. Army, leaving recently at the rank of Captain. His most recent assignment was teaching in the Reserve Officer Training Corps (ROTC) at the South Dakota School of Mines in Rapid City, South Dakota. Jeff's earlier military duties were focused upon leadership, management, and training activities in connection with various technical systems, including computers.

Dmitry (Greg) Kochergin. Greg has joined MicroImages' software engineering team from Moscow. Greg graduated in 1988 from the Department of Applied Astronautics of the Moscow Institute of Geodesy, Aerial Survey, and Cartography with the equivalent of an M.S. degree. (Same year and school as Dmitry Frolov, who is also on MicroImages' software engineering team.) After graduation and commissioning as a lieutenant, Greg was employed for three years at Priroda, one of the organizations currently distributing scanned Russian satellite imagery. Four years ago, he and Dmitry Frolov formed the subgroup at VIDAR, Ltd. in Moscow which used and marketed MicroImages' products. During that period, Greg worked in part at promoting TNTmips in Russia, translating manuals, desktop publishing, and related software activities. Recently VIDAR has become more narrowly oriented toward medical software and publishing. As a result, Greg has been "imported" to create MicroImages' software products. His initial software responsibilities will be to create several miscellaneous raster utility capabilities which will be added to existing processes.

David L. Wilson. Dave will initially join MicroImages' software engineering and support teams. Dave graduated from the University of Nebraska in 1987 with a B.S. degree in Electrical Engineering and subsequently completed nine hours toward an M.S. in Electrical Engineering specializing in image processing under Prof.
Robert Li (also a former MicroImages staff member). From 1989 to 1992, he was employed by Allied Signal Aerospace in Kansas City as an Electrical Engineer responsible for providing PC software support (AutoCAD, SPSS, databases, ...) to their Components Engineering Group. From 1992 to 1995, Dave was a co-owner of Imagik Photo Laboratory near Kansas City, Missouri. In this company he designed, assembled, and maintained PC based image processing stations and used them to provide custom photo restorations and related services. Dave's unusual split-team appointment at MicroImages reflects the initial tasks which he has been employed to complete. He will immediately provide the simple descriptive and parameter descriptions for those functions in the TNTsdk (thousands of C functions in the TNT software development kit) that do not have them. Also, sample source code snippets will be prepared as time allows to serve as examples to those clients using the TNTsdk. Similarly, Dave will also update the written SML (Spatial Manipulation Language) material available to all clients. He can then continue on to provide the new reference materials needed to support the significant expansion of the SML planned for V5.60.

Dr. Randall Smith. Randy joined MicroImages in July as forecast and has settled in and been busily updating the on-line documentation (see the previous MEMO for his background).

Students. The three students employed to assist in the TNTlite mass mailings over the summer holiday have now all returned to school. They did a good job on this boring work. The students working part time on inputting material for the MicroImages web pages are continuing on this task.

The following "special" support request call occurred on 21 October 1996. MicroImages knows that Thanksgiving is near with a call like this. I wonder how this person would have navigated the typical electronic answer/query system to the answer needed? [MicroImages] "MicroImages, this is Justine." [Caller] "Hello, this is [a name]. I'm trying to defrost an 11 pound turkey and I would like to do it in my microwave. I was wondering if this will hurt my microwave?" [MicroImages] "I don't think it will hurt your microwave, but I think you have the wrong number." [Caller] "Oh? Who did I get?" [MicroImages] "This is a computer software company. But, I do know that the safest way to defrost a turkey is in the refrigerator, but this will take a few days." [Caller] "Well, I'm not in any hurry, we aren't having it for dinner tonight or anything. But that must take a long time." [MicroImages] "It does take a few days, depending on how big your turkey is. Another way you can do it is to put it in a sink of cold water." [Caller] "I think I'll do that. I was worried about micro-organisms and all that." [MicroImages] "Yes, defrosting it in the microwave will get the outside thawed a lot quicker than the inside." [Caller] "Thank you for your help. What do you do there?" [MicroImages] "I do the books; payroll, invoicing, that sort of thing." [Caller] "I don't understand." [MicroImages] "Bookkeeping." [Caller] "Oh, I see. Well I appreciate your helping me. I was trying to reach a microwave store. Thank you very much." [MicroImages] "You're welcome."

Those who are, or have become, seriously involved in the use of geospatial analysis need to know what is being provided or not provided by various products. As a serious user of TNTmips, you know its scope, what it does and does not do, and how it is done.
Clients using other products have similar experience with their realities as well. However, if you only use TNTmips (or have not yet made a choice), you might wish to plod through the Grapevine MEMOs. Unfortunately, promotional materials and advertisements provide only lists of what a product can do. These materials do not equate to what you find during actual use of the product. The Grapevine provides the real scoop on some of the problems and shortcomings of these other products. Its contents also provide useful technical reference materials on what GIS is and how its practitioners go about applying it. A year's worth of Grapevine MEMOs is now available, totaling hundreds of pages. If the Grapevine is useful to clients and dealers, it might be feasible to take this information one step further. Several derivative products might be useful: 1) an index to topics (but the material would still be spread over multiple issues); 2) a reorganization of the important quotes into topic arrangements (but this would mean its expensive republication); 3) after topical reorganization, the original information could be paraphrased into a problem list--something like the enclosed 55 Commandments MEMO prepared by others--but without the attempt at humor. Please provide information on these ideas and the usefulness of the Grapevine MEMO. It is a lot of work each quarter to read all the material on the list servers and other sources and select and prepare the small selection actually used.

Correction and Apology to Earth Resource Mapping MicroImages would like to completely retract and publicly apologize for a number of inaccuracies that were reported by MicroImages recently. In particular, we would like to publicly state our acknowledgment that:
particular, we would like to publicly state our acknowledgment that:

First, it is well to note that any declaration of software availability, such as the 6 to 12 months quoted above by ERMapper, realistically needs to be multiplied by a minimum of two. I speak from many years of experience of being promised such schedules both internally and with the products of others I buy, as well as of giving honest current estimates and promises to our clients which subsequently cannot be kept. Desktop, interactive, accurate, simple-to-use DEM/ortho processes have not been trivial to create in TNTmips over the past couple of years. It is thus to be expected that they will not be run up overnight by anyone else starting from scratch, even with photogrammetric expert(s) to provide the design and source code.

It is worthwhile to review the situation in the general desktop soft photogrammetry marketplace. The large, complex, production-oriented soft photogrammetry products are special purpose, very expensive, require big computer capacity, need a photogrammetrist to manage, and employ highly trained operators. Intergraph's solutions fall into this general category.

PCI. PCI's OrthoEngine™ is a separate product. Read one of their many advertisements carefully. It will not indicate that the product produces a DEM; it is therefore not a soft photogrammetry product. However, producing an accurate DEM is the real goal for most uses of soft photogrammetry products. Creating the accurate DEM is most of the complex coding work and is the key to producing ortho images from a wide variety of source images. We in the United States and Canada are well endowed with increasingly accurate public and privately available DEMs. Those in other nations are not! Thus, creating a suitable DEM allows a geospatial system to create many other useful digital layers and associated hardcopy products: ortho images; drainages and watersheds; viewsheds; perspective and stereo views (V5.50); surface and volume attributes (V5.50); fly-bys and drive-thrus (V5.60); surface routing and related path analysis (coming); and many other terrain and physiographic analyses.

ERDAS. ERDAS originally marketed a soft photogrammetry module with code they adapted internally based upon source code licensed from staff at the Geodetic Engineering Department at Ohio State University. Subsequently, this approach did not work out for them. Now they market and support an expensive, separate OrthoMax product they have licensed from AutoMetrics. AutoMetrics also markets the product separately, so ERDAS's license is not exclusive. This product was developed originally under high cost U.S. military contracts.

ERMapper. ERMapper has had a good source of theory, code materials, and concepts directly from Australia's excellent CSIRO labs. In fact, it appears from the above that they have had a design available from this source since they declared a start in this direction over 18 months ago (at about the same time they were mailing out all the CD coasters). But, as they are no doubt discovering, and as ERDAS also found, using software designed by, and for the use of, its creating experts is not so easy!

What is Needed? The inexperienced or casual user will not likely be rigorous or tolerant in their use of a soft photogrammetry process. Photogrammetry is based upon precision measurements.
Most users of desktop IPS systems have no background in the very accurate, appropriately distributed inputs which are so critical to achieving accurate results with these semi-professional systems. This is not a negative reflection upon the user. If you want to support the "casual" user of soft photogrammetry, then it is the responsibility of the software creator to take the "photogrammetry" out of the process as far as this user is concerned.

A recent example in using georeferencing in TNTmips serves to illustrate this. MicroImages was supplied with an image and its georeference data from a site where its georeferencing was not working well. The client had started with three points which fell close to being on a straight line. This did not define a plane in space and incorrectly influenced the solution for the subsequent points entered. This problem was not reflected in the residuals the process provides, so the peculiar arrangement of the three points was not obvious. Proper, accurate distribution of georeference, tie, and control points is critical. Most soft photogrammetry processes will crash and burn with input control points which are just a little inaccurate or biased in their distribution. Residuals obviously do not help that much (a sketch of this pitfall follows at the end of this section). MicroImages is searching for ways to provide other feedback with regard to the value (i.e. position), accuracy, and impact of the collection of points which have been entered. TIN densification, released in V5.40, was a nice step in this direction, as it minimizes user input and can compensate for some incorrectly input tie points. Multiple hidden "traps" were added to the georeference process this quarter to try to check and inform clients about the suitability of their current georeferencing model.

TNTmips. So where is the TNT ortho/DEM process in all of this? First, it is not a separate product, but an integral part of every TNTmips product. The code used is entirely that of MicroImages and was not adapted from anywhere else. As a result, it has the same general interactive interface and look and feel as other TNT processes. Since the process is totally MicroImages' in design and implementation, we are learning and accumulating internal knowledge about what is needed, and using it to gradually improve the process. This process should receive considerable additional attention in the next quarter. Fortunately, the effort already expended and to be expended is providing serendipitous results in other processes. MicroImages spends a lot of time thinking about how software users act and react so we can try to make things easier to use. Because of their tight integration with other processes, the DEM/ortho development activity and the functions created for it are beginning to cause improvements in other processes. For example, the need for fast TIN formation, and especially re-formation, has provided very fast functions for this purpose which were then used in the interactive editing of TIN objects, for faster and better surface fitting methods, for better 3-D perspective, and for stereo surface rendering. Correspondingly, the restitution functions written for creating ortho images will subsequently be used to render an acceptable stereo view from a DEM and a single georeferenced image.

In conclusion, those GIS and IPS designers, and also their clients, who buy into somebody else's independent soft photogrammetry approach are shortchanging themselves in the long run.
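To make the collinear-control-point pitfall concrete, here is a minimal sketch in Python with NumPy (a hypothetical illustration only, not MicroImages code or the actual TNT "traps"): a 2-D affine fit to exactly three control points is an exact interpolation, so its residuals look perfect no matter how degenerate the point geometry is, while a simple singular-value ratio on the centered points flags a near-collinear arrangement immediately.

    import numpy as np

    def fit_affine(src, dst):
        """Least-squares 2-D affine transform mapping src points onto dst points."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        A = np.hstack([src, np.ones((len(src), 1))])    # design matrix rows: [x, y, 1]
        coef, *_ = np.linalg.lstsq(A, dst, rcond=None)  # 3x2 block of affine coefficients
        return coef, dst - A @ coef                     # coefficients and residuals

    def collinearity_ratio(points):
        """Smallest/largest singular value of the centered points.

        A ratio near zero means the points lie almost on one straight line,
        so they cannot anchor a plane or a stable affine model."""
        pts = np.asarray(points, float)
        s = np.linalg.svd(pts - pts.mean(axis=0), compute_uv=False)
        return s[-1] / s[0]

    # Three control points falling almost on the line y = x, as in the client
    # example above; dst is simply src shifted by (100, 200).
    src = [(0.0, 0.0), (10.0, 10.1), (20.0, 19.9)]
    dst = [(100.0, 200.0), (110.0, 210.1), (120.0, 219.9)]

    coef, residuals = fit_affine(src, dst)
    print(np.abs(residuals).max())   # essentially 0: the residuals look perfect
    print(collinearity_ratio(src))   # about 0.005: the geometry is degenerate

A warning keyed to a threshold on this ratio, raised before further points are accepted, is one plausible form such suitability checks could take.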
The following are some of the positive written comments received by MicroImages during the last quarter, reproduced exactly as written except for the [edit] alterations in [brackets] that keep them anonymous. Many more comments are received by MicroImages by voice but cannot be reproduced here verbatim as quotes. Please note that these quotations are not edited from their original form in spelling, grammar, punctuation, etc.

from a prospective client in Spain on 22 July 1996. "Of course, I received, installed, and used TNTlite. I think it is a great software, and if fully fills our needs. However, we are a small university and we currently lack the money to purchase TNTmips. I am trying to obtain some funding because I am fully interested in the package. However, my work must go on, and I am looking for other cheaper packages with the same capabilities as yours...without success."

from an email from a Polish client on 23 July 1996. "As an educational institution we use some systems in educational process and sometimes another in research (Genmap, Genasys, Erdas Imagine, ErMapper, Idrisi, MapInfo, Oracle database and of course TNTMips). TNTMips is not such popular in Poland as Erdas or ArcInfo, but in our opinion TNTMips is the most universal system we know. So we are interested in installation net version of TNTlite in our lab and buying additional CDs for student who would be using them at home. On the other hand we need full version of TNTmips for our research projects." [This email continues on regarding details of an additional new system to be ordered. Subsequently, this client has purchased an additional UNIX based TNTmips product (model U240 for five simultaneous users on an SGI platform).]

from an email on 25 July 1996. "Just found a rave review about your GIS product on the Web, and am writing for more information."

from a letter from a TNTlite user dated 7 August 1996. "I have been trying out TNTlite for a period of time and have been very satisfied with your product. TNTmips seems to be a complete package that supports all the needs for GIS software. I have been searching for a Swedish dealer, but have come to the conclusion that there isn't anyone on the Swedish market, and therefore I would like to introduce and sell the MicroImages products in Sweden myself."

"I'm working in the county administration office with education and development work in the GIS sector, but am in the first phase of starting my own business, a company that will offer education and software development in the GIS-sector. I've tried several programs out and I've experienced that they either have been afflicted with a lot of limitations or have been far to complicated for the most users to operate. In light of this experiences and my knowledge about the Swedish GIS-software market, I'm firmly convinced that TNTmips have potential to become one of the most popular GIS products in Sweden."

"During training courses in Arc/Info I've been holding I have noticed that the students have had problems with the user-interface. During my work at the county administration I've been using other programs, such as Arcview, Mapinfo, PC arcinfo, Idrisi and Map factory but I think that TNTmips is a better choice compared to them." [continued on with details relative to ordering]

from a web site entitled FREE PROMOTIONAL CD-ROM OFFERS (PC Only). [This site is at (www.west.net/~cdromug/free/cds.shtml) and describes hundreds of unrelated free CDs and provides links to the sites where they can be ordered.
MicroImages submitted a standard TNTlite kit to them for their review. Their current listing for TNTlite is as follows.]

"MicroImages, Inc. (for students and small projects" "-offers their advanced TNTmips and TNTview products FREE for unlimited use in small projects. TNTmips offers a complete approach to geospatial data management and analysis: GIS, Image Processing, Surface Analysis, and other geospatial analysis and manipulation activities. (This is the public domain small project version of the same commercial package they sell for $5000 to $10000!) They request $25 shipping and handling fee which seems exorbitant for a freebie, but product is awesome.) 397MB - WIN/WIN95 7/2/96 - * = received and installed - grade = A = I love it!"

["Awesome" and "grade = A = I love it" are really good marks for TNTlite from a generic site listing all sorts of free unrelated CDs and which now links to microimages.com. These are words not usually bandied about in connection with massive technical products. In this site's current list of several hundred free CDs, their coveted "A" grade has been assigned, besides TNTlite, only to a couple of hot game CDs; the Lotus Notes trial CD; and Microsoft's 'TRIAL 95', a two CD set providing a 90 day free trial use of Windows 95, Office Professional, Project, and Plus. This puts TNTlite in pretty good company!]

[However, one point does need clarification. The $25 charge gets you two CDs (original and next release if registered), two sets of materials, and two shippings even to any address outside the USA. This does not seem "exorbitant", but this webmaster does seem to run toward superlatives in this review.]

from an order from New Zealand for TNTlite on 30 August 1996. "Incidentally your brochure arrived literally 3 hours before we were to order an alternative package for a newly equipped teaching laboratory. Needless to say we have held back the order .... Thought you might like to know that!"

from an email from a client in Germany on 21 August 1996. "It's up to me to apologize, too. I never intended to blame your product. I have been working as a windows programmer and database administrator for the last 5 years (and I have been in computer business for more that 10 years now) and I really do appreciate your program because I know about the hard work that has to be done behind the scene (besides of all the special GIS stuff in the case of TNTmips)."

"So let's state that our communication took place on a day that was not one of the best for both of us."

from email from a prospective client in Italy on 26 May 1996 using V5.30. [The following is a portion of an email communication between members, in two different nations, of a multinational European team conducting a funded project to implement expert system and artificial intelligence software for geological exploration. Their new software will use someone's GIS software for its underlying geospatial data management component. This Italian company reviewing TNTmips (actually TNTlite at the time of this summary) is not a client of MicroImages but is responsible for selecting the appropriate GIS component and writing a significant portion of the higher level software required.]

"We received the demo kit of TNTmips about a week ago. We had problem with the hardware key, so that up to now we could only work with the lite version of TNTmips (which does not require the key)."
"From this first analysis of TNTmips we can make the following remarks:" "We tested TNTmips on different platforms: UNIX on SUN Sparcstation 20 and HP 712/80, and WINDOWS NT 3.51 on PENTIUM 100. On all the platforms TNTmips seems to run well, the performance of the software are similar in all the machines. We were surprised by the very little difference among the PENTIUM and HP platform, apparently the MicroImages X server (MI/X server) specifically developed for INTEL WINDOWS platforms works quite well. We make some test on it and proved to be complete and standard X server." "The TNTmips software is very complex, powerful and versatile so that a complete analysis of it requires much more time, but certainly it is one of the best tools for geography and geological analysis among commercially available GIS." "We could not yet test the Software Development Kits (TNTsdk) because we are still waiting for the hardware key which allows the use of the development kit. Up to now, the documentation we received explains that the software development is platform independent. As [a name] told us in his previous mail, you can write one set of C code using the X, Motif, and MicroImages libraries functions and then compile it once on each final machine. MicroImages supplies the X and Motif libraries for the PC, which are not sold by PC vendors." "According to the application notes we received you should use the WATCOM C compiler, but MIcroImages told us that nowadays they are using the MICROSOFT C compiler." "The only problem we found is the documentation which they send us up to now [for the TNTsdk] is rather old: the [the TNTsdk] Application Notes is a draft copy of June 1993, and Lacking: apparently only a reference manual is available." "The TNT software is based on one specific .RVC project format file, which handles RASTER, VECTOR, CAD, and Data Base objects [also TIN objects], and there are library functions which can manage this .RVC files without bothering [with] the internal structure of them. This file format seems to be suitable for our [a name] system." "TNTmips can create links with external database files, or can import database files in its .RVC files. There are a number of database format which are supported, we must match them with the database we will use in our system." "The evaluation that geological and geophysical experts about TNT product were very good and in fact the geographic calibration and georeference capabilities of the TNTmips seems to be very exhaustive." "The budget required for the TNTmips package is reasonable compared to the price of other products, and the UNIX platform license comprise [encompasses] the PC platform." "Our conclusion, from this first look to TNTmips, are quite positive. We are planning to try some simple software integration as soon as we receive the hardware key, if you are also planning to do some integration test we could plan together the tests." [This independent opinion is clearly from a competent technical person who in a short time with TNTlite had no difficulties in identifying many of the key technical characteristics of TNTmips. MicroImages has continued to discuss this advanced use of TNTmips with this group as the geospatial base for the construction of their system. MicroImages is optimistic that this will be the end result.] [In reaction to this opportunity and his comment regarding the status of the TNTsdk Application Note, MicroImages has just employed a new technical staff member. 
His sole initial responsibility is to upgrade and improve the Application Notes for the TNTsdk and SML. He will also create sample source code snippets and example code modules to include as sample models illustrating how to get started in using each of these approaches.] from a FAX from a MicroImages Dealer outside the United States on 27 August 1996. "I have just completed three training courses, one on GIS and two remote sensing/IP courses. These were well attended by a broad spectrum of mining industry personnel from a large cross-section of [a nation's] mining companies. For practical 'hands-on' parts of these courses, I made extensive use of TNTlite, and, compared to my past experiences of running practicals, I can only say that TNTlite is the prefect GIS/IP training product. It is fast, easy to use and, since it has the full TNTmips functionality, the training can be real, in-depth and as sophisticated as the course instructor wishes." "These courses have been my main TNTmips marketing vehicle over the past two months. I have had ardent ERDAS, ARCinfo, MAPinfo and ATLASGIS users on [in] these courses, and all, by the end of the course, have been convinced of TNTmip's broader capabilities, ease of use, greater stability and superiority when compared to these other systems. Every course delegate has left with a copy of TNTlite, and I hear that many of them are already using it successfully on small projects. One company [a name] major iron and steel mining/exploration company) sent delegates to all these courses, and the geologists insisted that the head of their computing division attend the last course, 'to see why they, as geologists, want TNTmips' in their organization, as opposed to ARCinfo and ARCview, which was the solution on offer. After only one session on TNTlite, this person was 'very impressed'; by the end of the course he was convinced. 'I don't know how our company can afford not to have TNTmips as one of our exploration tools.' I am therefore confident that we will shortly be getting an order from [the company]." "I personally prefer to distribute TNTlite in this manner (after a course) as I feel people then have a real appreciation for the value of the 'free product' they have received." "We are continuing our negotiations with [two organizations] (who at present have no I.P. or GIS capabilities). The ESRI agents continue to frustrate our efforts, promising these companies that ARCview2 will deliver that they want, and ARCview3 will have I.P. capabilities etc. I can only keep reminding them that TNTmips has had these capabilities for years, and that it is a product that offers them solutions now, not in the future on the back of uncertain promises. I think this message is beginning to sink home, especially as their geologists return from my courses with TNTlite and proceed to do things these companies still only dream of doing, on their own office/home PCs." an email exchange. question posted on the Internet from Alaska on 7 August 1996. "I'm looking for a program that can convert raster (scanned) info into vector based info that can be used by arc-info. I'm using a Mac Q650 (040 chip) but will be upgrading to a PPC. Any info appreciated. Thanks! Oh yeah, does anyone know if Arc-view can import / convert raster data." answer posted from a MicroImages client on 8 August 1996. "MicroImages TNTmips will run on a Mac (or Windows or UNIX) and can convert raster to vector. Check out their home page at http://www/microimages.com. 
Export to Arc/Info is also possible as TNTmips has many import/export routines."

from an email from a U.S. client on 30 August 1996. "We finally got TNTmips installed on the unix system. Our key was first installed on the NT. I was waiting for the sysop to put it on unix. Just went to Solaris 2.5 SunOS 5.5 on a Sparc 20 server."

"TNT INSTALLED WITH NO PROBLEMS! We were up and running in the 20 minutes it took to do a complete install. We have never had such an easy installation, and still get error messages from Erdas's Imagine and Arc/Info [on the same machine]."

"Hats off to the product Developers at MicroImages."

from an email from an Arizona client on 6 September 1996. [a user of TNTmips professional products] "For your information. I am going to use mips in my current class. It [TNTlite] will be loaded on 12 PC computers (and used for my two weekly lab sessions with 11 students each). Depending how it goes I may also use mips [TNTlite] for the following class sequence next semester."

"I'll keep you posted."

from an email from a long-time South African client on 11 September 1996. "I have [ERDAS] Imagine 8.2 for Windows 95 which I bought because I was offered a good deal [from its original purchaser]. Whereas it has excellent image processing capability and nice map layout tools, its vector handling routines are primitive compared to what TNT has to offer. I needed a second TNT type product but could not afford to pay upfront." [Since V8.2 is the latest version of ERDAS Imagine, one would assume at this point that the original purchaser of this ERDAS Imagine 8.2 had come to some sort of similar conclusion or result?]

"I am going to return my Imagine 8.2 because it does not work for me. The DXF importer always bombs on me and it cannot print to my HP 820C printer. I try writing EPS files for large format bureau plotting and it write junk. I got frustrated and wrote an angry e-mail message and now I have been blacklisted by ERDAS. I have always been impressed by MicroImages professional approach to dealing with hotheads like myself. No grudges water under the bridge approach. All my problems were trivialized by ERDAS even though they were mission critical. I reported a DXF importing problem 8 months ago and still have no resolution of the matter. I bought the HP820C to discover that it is a non-supported device. Yet I was told by ERDAS engineers that Imagine 8.2 writes to any Windows 95 supported device. After months of wrangling, I freak out and then get blacklisted. I was told by the local distributor that ERDAS was unhappy with my attitude. To ensure my problem would get resolved I faxed a letter apologizing for my 'attitude' even though I do believe I had a right to get angry. No response to that. I may have an 'attitude' but I think ERDAS have a bigger 'attitude' problem."

"I may have to speak to you about a second TNT license. In the long run, I have been seriously burnt by going the Imagine route." [MicroImages operates under the premise that all complex (and some not so complex) software products, including ours, will have problems; it's really how they get fixed that counts!]

from an email from the same South African client on 26 October 1996. "... TNTmips version 5.3 is error-free as far as I am concerned. This is a far cry from the early days when patience was required. Thank goodness that is behind us. TNTmips imports any dxf file I throw at it, by the way.
ERDAS Imagine (Windows/95 version) has very limited vector support - you can overlay dxf and arc/info coverages - thats all. ..."

from an email from a user of TNTlite in Spain on 12 September 1996. "I have to buy a portable computer and one of the programs that I'm likely to use most is TNT [has TNTlite only]. According to TNT's performance, would you advise a Pentium-based machine or a Mac Power book? I've found TNTlite to be rather slow on a Mac Quadra 900 (16 Mb RAM)." [An example of someone who finds TNTlite sufficiently useful to influence the type of computer to acquire!]

[In a previous MicroImages MEMO, it was documented that a Quadra or any Mac using a 68040 or earlier chip functions like a PC 486 or PC 386 at the equivalent megahertz. This Quadra runs all software like a PC 486/25 or /33. New Power Macs for the office give TNTmips performance very similar to that of a Pentium at the equivalent megahertz. So for an office setting, either a new PC or Power Mac will perform equally well if equally equipped and powered. However, a Pentium portable was recommended, as currently no Power Mac portable is available in the greater than 100 megahertz category, whereas there are many choices of Pentiums at 120 or 133 megahertz.] [Now discussing buying TNTmips Pro for their workstations.]

from a FAX from a German client on 11 September 1996. [This client site uses most of the features in TNTmips and has kept MicroImages well informed of errors in the past. These errors have been fixed, some fast, some later. Working with these and other supportive clients to improve the reliability of the TNT products has continued to the benefit of all of us.]

"So far we did not experience any severe problems or even crashes with version 5.4! All out projects that we created and modified under previous versions can be fully used with version 5.4, this also holds true for Map-Layouts, legend objects and themes. However from our work with TNTmips we found some items that could be improved with the next coming version:" [A list of suggestions for enhancements and new features occurred next.]

"In our recent Jordan-Project we have run TNTmips 5.3 on a laptop TOSHIBA CS100. This was a most vivacious solution in an environment of mainly heavy ArcInfo machinery. The availability of Import-Export tools, e.g. E00-format, made it the most efficient GIS known to our Jordanian partners." ['Vivacious' can mean 'lively' or 'agreeable', either of which would probably apply to introducing TNTmips on a low cost portable into a "big iron" ARC/INFO shop representing a lot of investment.]

from an email from a South African client on 12 September 1996. "I thoroughly enjoy your product and only wish I had time to learn more of its functionality. Keep up the good work."

from an email from an Australian client on 14 September 1996. "Thanks for the message. It is good to hear that IRIX 6.2 will not pose any problem. I am also glad that TIN processing [for a specific surface fitting process] has been upgraded and will be waiting for V5.5."

"On a positive note I appreciate very much for the tremendous effort TNTmips [MicroImages] has put to improve the capabilities. I believe it is now one of the best integrated RS-GIS software."

"I would also like to see that the other development specially STEREOSCOPIC MODELING with EDITING capabilities for drainage, ridge lines, escarpment/cliffs etc., TIN with breaklines, drainage etc."
[This is one of the major foci of software development activity for TNTmips at this time.]

from an email from a new client in Germany on 16 September 1996. "Congratulations for your TNTmips software, by far the best GIS I ever tried!!"

from an email from a Tasmanian student using TNTlite on 26 September 1996. "I was sent a copy of TNT-Lite because I had corrected some FUD on the Mapinfo-L [MapInfo list server], thank-you. I hadn't got around to registering it yet, largely because I hadn't had much time to play with it, but I got a note saying that version 5.4 was shipping and reminding me to register - thank-you again, and I might say that the difference between MapInfo's behavior towards customers and what I have experienced of yours does not reflect well on them!" [This email from the student user of TNTlite V5.30 goes on to document two errors. Both were corrected by V5.40, which he would already have been using if he had promptly completed his registration. And yes, he also got an email response from MicroImages' software support regarding his problems.]

from a FAX from a MicroImages Authorized Dealer on 27 September 1996. "Although TNTmips came out of this review looking very good, I think that comparisons of TNTmips with IPS don't do it justice because they tend to concentrate on the features that all IPS aspire to. There is not enough emphasis on its GIS functionality. The statement that TNTmips provides good integration with GIS misses the point that TNTmips is a GIS and spatial images are just another kind of spatial information. Why buy an image processing system when you can get a GIS that does the same?" [This is why the cover statement was added to the reprints of this article which are being shipped with V5.50 of TNTmips. This cover statement looks at it from the viewpoint of price versus performance. The above paragraph addresses essentially the same idea but from the viewpoint of functionality. The above dealer concludes that while the review is correct in what it says, it is misleading in what it leaves unsaid about TNTmips and therefore also possibly about the other products.]

"The main criticism of TNTmips these days seems to be the documentation. Personally I almost never use documentation for anything. I believe that learning to use a complicated system like TNTmips by trial and error is cheaper and more effective than a training course. You learn all sorts of things you didn't need to know, but which come in useful sooner or later." [Unfortunately there is a whole spectrum of those who buy software. This ranges from experimentalists to manual memorizers and from theoreticians to savants. So, we have some clients who read documentation cover-to-cover before they install the software. Others use it only as the last resort for reference when they cannot work it out for themselves.]

from email from a client in Ecuador on 2 October 1996. "Note: We do own a current license of TNTmips. I plan on using TNTlite in a week long remote sensing workshop I am teaching for a week at the [a university]. Any suggestions? Last minute observations? If so please respond by email, but here's the rest of the information for the upgrades. I think this program will be a lifesaver for this seminar - I had doubts about all the licensing / key trauma... It will also spread the word about TNTmips down here. Thanks!"

from a FAX from a MicroImages Dealer outside the United States on 7 October 1996.
"I have just returned from a trip to [a city], where I set up a TNTlite Lab in the Geology Department at [a university], and gave a one week intro. course on Remote Sensing/Geodata analysis. They are most impressed with the software and will be trying to get funding for purchasing one professional TNTmips license to master training data sets and for research purposes." [This FAX goes on to explain that the TNTmips key was stolen after the course but fortunately was insured. A new TNTmips key has subsequently been ordered. Insure your valuable property!] from an email from a U.S. client on 10 October 1996. "Its nice to see you take a genuine interest in your customers, this is one of the reasons I don't mind advocating your products - you can call MicroImages and get a real person at the other end." "Currently I am preparing for a new grant from the [an agency] (that is what I am buying the additional TNTmips license for) that involves updating 1978 land cover to 1996 land cover. The plan is to use TNTmips to georeference and rectify (as best we can) USDA 35mm color air slides and then update the 1978 vectors on top in TNTedit. The area in question is 31 townships surrounding the [a river] which is a [name] site. This grant augments the one I just finished in which we used TNTmips and PC Arc/Info (I would rather we had the second copy of TNTmips back then) to convert several paper data sources to digital form. The most time consuming layers were soils and National Wetlands inventory which is where TNTmips was a tremendous asset with its powerful raster to vector ability." "I have many other small projects going all the time with local government and other segments of the University. I am constantly consulting with our Geography and Geology departments and many students who come in to pick my brain. I have loaned the TNT CD to at least 3 students so far who have wanted to use TNTlite and they were quite impressed. The Geography Department just borrowed the CD earlier this week and plans to install TNTlite on 18 new Pentium 133's just bought." "Hopefully next year I will have time to publish a few articles on projects that use TNTmips." from a FAX from a client in Portugal on 11 October 1996. "I've been exploring the TNTMIPS (v. 5.4) capabilities and would like to give you my congratulations for it and for the excellent documentation you provide." from an email from a Canadian TNTlite user on 11 October 1996. "I am a graduate student at the University of [a name] and have been given the opportunity to help revamp the 3rd year course entitled Resource Planning and Management. This course currently uses OSUmap [?] to introduce the students to GIS. I would like to use TNTlite, a copy of which I was recently given by a fellow grad student. The professor for the course has approved, providing we can use the raster files originally used in OSUmap. I have from now till Christmas to develop a set of readily understandable instructions for a software package that I know virtually nothing about. Does MicroImages have anything available (eg in the form of a tutorial / simplified manual / example lessons / etc.) that I could get access to?" "In the past, the GIS component of the course has consisted of the 'typical' GIS functions of restraint mapping to determine the most suitable site for.... (some activity). This will not change. 
The professor, however, has agreed that as SOME students become more GIS literate, they will want, and should have, access to more sophisticated software so that they may explore the subject in more depth." "Any assistance you can offer me would be greatly appreciated." [By the time this class starts early next year, he will be using the powerful V5.60. In reaction to this kind of input from TNTlite (and TNTmips) users, you will find that more and simpler "how to" materials are becoming available. The first examples of this material are included with V5.50: Getting Started Displaying Geospatial Data; ABCs of Image Analysis with TNTlite; TNTlite Recipes; and several new practice TNTlite data sets. More materials of this type will ship with V5.60 for the beginning user.] from an email from a TNTlite faculty user in Australia on 17 October 1996. "We are quite excited with TNTlite and are wondering whether you can guide use in terms of the availability of text(s), tutorials that use TNTlite/TNTmips at a level suitable for introduction of GIS/RS to first year University students." "Perhaps you could point me to a particular site of provide a URL?" from an email from a TNTlite faculty user in Canada on 17 October 1996. "I am the Lab Coordinator for the GIS lab in the Dept. of [a name] at the University of [a name]. Currently we are running ArcView, MapInfo, and PCI [Easy Pace] for student training and research projects. The TNTlite package looked like a good, affordable alternative that students could take with them when they left. The potential is also present to get the full TNTmips package as an alternative to PCI." [It continues on with info about downloading and a problem getting installed.] from an email from a U.S. client on 22 October 1996. "Thank you for the e-mail. I like conversing about my TNT work. No one here (yet) to do that with." "I do not mind advising the person from South Africa. It will be good to talk about TNT, and especially the calibration and correction of TM data (my results were absolutely striking! I made natural color (3,2,1 RGB) and CIR (4,3,2) look like aerial photos, photos from 700 km up!) I miss the TNT chat line." "The SML script development you mentioned. Will this be specifically for converting from DNs to radiance? Or a more general linear calibration function(s) that could be included in SML scripts (ala the Focal functions, etc.)" from an email from a client in Germany on 25 October 1996. [The email starts with a description of a continuing troublesome problem and finishes with:] "4) Now I want to switch to a totally different matter. Unfortunately I did not find time to have a closer look at the new features of V. 5.4 until these days. I want to tell you that I am very pleased, especially with the vector combination module, the new options in 'measure' concerning rasters." "Reading about the planned features [for V5.50] concerning extended database capabilities and vector treatment makes me look forward to the future releases of TNTmips. Already now I think that TNTmips is the best value for the money in the market (I believe that I can judge this to at least some extent because, besides work, I am studying Geomatics at the [a university], Austria, where I learned a lot about what you can expect from the different systems that are around. Also, I am in touch with other participants who are using the 'big' systems like ArcInfo, Intergraph, and who are not satisfied with what they got pretty often." 
[So now TNTmips can claim to be the biggest 'little' system around.] from an email exchange at the end of October 1996. [Student on 29 October 1996] "Hello, I left a voice mail yesterday, and realized it would probably be easier for you to get a hold of me via the network. I am very interested in the software available from microimages, TNTlite. I am a graduate student at CU Denver in environmental science, as well as an employee of the USGS. I am currently building a GIS for a Nature Conservancy property in South Park that is to be used to catalog soil, water, and species cover data. The coverages are in ARC/INFO format and the data is being accessed in INFO using the CURSOR tool. My questions are:" "1) Does TNT support ARC file format?" "2) Is the database relational?" "As I am graduating soon, I would like to turn over the maintenance of the database to another student in the Biology Dept. However, it is unlikely he/she would have ready access to a workstation, ARC/INFO license, etc. I plan to migrate the GIS to the Mac/PC platform to solve this problem, but I am not satisfied with the current offerings from ESRI or MapInfo, especially given their relatively high cost. I would very much like to try your product, as it may be the solution I'm looking for." "Sincerely, [a name]" [Dealer on 30 October 1996] "Thank you for thinking of using e-mail - I received your phone message among a dozen others and this is definitely more efficient for me in term of response. I would like to assure you that if you are looking at passing along a project to someone and are considering a Mac or a PC as the vehicle, that TNTmips/TNTlite would provide the most ideal system for achieving this goal. TNTlite is just a scaled down version of TNTmips - but only limited in the size of the fields you can assemble to work on. The toolset is exactly the same in both packages - no modules! Also, no command line prompt - everything is operating in a windowing environment (currently under an X Windows Server) and you can work on the same project file even if you have a Mac, I have a PC, and someone else in the UNIX world. Other benefits are: the software is updated quarterly (yes every three months), technical support is free and human, you get to work with all the data structure in one system (raster, vector, CAD, database, TINs, text...), and I believe the system is easy to learn for the APPLIED person - the scientist, resource manager, field biologist, consultant, geologist, etc." "The cost of the TNTlite system (on CD-ROM) is $25.00. If you access the MicroImages web page http://tnt.microimages.com you can find out more about MicroImages products and even download TNTlite for free! Be careful though - you'll download a 60 to 80 meg file that will unzip to about 100 megs or so." "I would be more that happy to send you the CD-ROM package by mail or if you wanted to come by the office here in Boulder that's fine too. However, to send you a copy I need a mailing address, so if you supply me with that I'll get a TNTlite in the mail for you." "And to answer your questions - yes TNT does support the import and export of Arc files, as well as MapInfo, ERDAS, Intergraph, AutoCAD, and a whole slew of others. The database side of life is evolving towards what I would consider a truly relational level of existence but I believe you would be the best judge of those abilities through evaluating TNTlite." 
"I would love to have the opportunity to carry on a bit more with you about this wonderful toolset and approach to spatial data management and visualization - I'll make it a point to give you a call tomorrow." "Thank you very much for your inquiry." [Student on 31 October 1996] "Thank you very much for the information on your product. It just so happens that I was attending the Geological Society of America meeting yesterday and stopped by the MicroImages booth. Chris, (I don't have his card with me) was kind enough to show me a demo of the software, and I was able to purchase a copy of TNTlite. It is already installed on my Mac, and I have imported several ARC coverages. I think it took me 6 months to get to this point using ARC/INFO! From what I've seen so far, TNT looks like it may be solution to my problem and the problem many students face in trying to learn GIS." "Thanks again." [Dealer on 31 October 1996] "Glad to hear that you got to the MicroImages boot at GSA and got TNTlite from Chris. Also, I am quite impressed with you comments on how quickly you were able to implement TNTlite and will be forwarding this on to everyone at MicroImages (as well as Chris). I do hope you will take time to send in the registration card as this will formally set you up with MicroImages Technical Support. Also, I am available to provide local support and solutions to whatever you may encounter with using TNTlite. I would also like to encourage you to come up to the office here in Boulder if you would like to see some of the projects we have worked on over the last five years using TNTmips. [The dealer] works on a wide variety of environmental problems around the world and I believe you might find some of these projects of interest. Alternatively, I am available to come down your way and give a presentation on MicroImages products to any group that you may feel would be interested. Please let me know if either or both of these scenarios would help you or be of interest. Thanks again - happy TNTlite-ing!" [Student on 1 November 1996] "I would be very interested in coming to your office in Boulder, it is an easy bike ride across town from my house to your office. I have passed along the TNTlite CD to the computer guru in my group and he is going to give it a test run. I'm not sure if we were in the market of a new GIS, but I'll let you know if anything develops. Let me know when it is convenient for us to get together, my schedule is flexible." [This exchange summarizes better than anything MicroImages can say about how TNTlite can work for everyone involved. The following points can be summarized from this exchange: Ô This is an experienced student who knows right away how to get started in a new geospatial analysis system and what to expect from it. Ô Can there be any lingering doubt from this and other testimonials that TNTmips is now in direct competition with workstation and NT ARC/INFO? Many of these kinds of exchanges do not mention the subject of image processing or analysis. Ô A responsible and interested dealer who uses, knows, and believes in the products they sell can make a lot of difference. This dealer certainly had no direct financial incentive to work with this student, but all of MicroImages' dealers would react in a similar professional manner. I think many involved, including clients, have a motivation and evangelistic inclination to help newcomers begin to use geospatial tools, usually without ulterior motives. 
The above exchange once again demonstrates my formula for operating a satisfying business, regardless of how big or small it may be: start with a responsible company which manufactures good products; select and work with responsible dealers who use the product; and provide the best support possible to intelligent professional clients.] from an email exchange with a TNTlite user in Australia on 1 November 1996. [University on 1 November 1996] "The academic staff at this University have decided to use TNTlite for teaching purposes next year. There are a number of issues that I need to clear up with MicroImages about licensing." "Can you please advise urgently who it is that I should email in order to discuss the issues." [MicroImages on 1 November 1996] "You may get multiple response to this; sorry if it's confusing. There are NO licensing restriction on TNTlite. You can copy and distribute and use it freely on as many machines in as many places as you like." "We expect you might be interested in purchasing a full professional TNTmips in order to prepare special datasets for a student / lab use. That product we sell, and it's license is protected by a hardware key (which can be moved from machine to machine)." "Terry Peterson is the one to talk to about TNTmips purchases." "peterson@microimages.com" "Of course, if you limit yourself to the various sample datasets we provide, you can probably get away without purchasing a full TNTmips, professional version. Either way, we're glad you've chosen TNT. Let us know if you have questions." [MicroImages also on 1 November 1996] "Regarding your email above, the person that you would need to establish contract with is myself. How might I be of assistance?" [University - same party on 5 November 1996] "My responsibility is to support the academics at this university. As we are a multi campus with small numbers (about 3000 at each of 3 main campuses) we have centralized rather than faculty based computer laboratories. The majority of our students (another 16000 or so) study at a distance using a mixture of paper based materials and the WWW." "It has been decided that we will use TNTlite for GIS subjects at the three main campuses and also for distance teaching. My responsibility is to ensure that the product is properly licensed by the university, ie that we pay any license fees and observe any restrictions that may be placed on the use of any software product. I have had email mail from [MicroImages, see above] and [a Dealer] and my conclusion from their correspondence is" "1. that we are free to use TNTlite in any of our laboratories and on any academic's workstation. There are no license fees to be paid." "2. that we are free to produce our own data sets and include them with any teaching materials delivered on campus or at a distance." "3. that we can distribute copies of the TNTlite software and data sets to our students anywhere in Australia or a around the world." "It is our practice to include a disclaimer which would identify you as the owner of the software." "Please advise if any of these conclusions are incorrect. Thank you." [My response via FAX on 5 November 1996] "I can assure you that MicroImages does not require any licensing for the use or distribution of our FREE TNTlite product or any datasets anyone might produce for use with it. [Name of university] is free to distribute copies of the TNTlite in any form: Internet, CD-ROM, ... anywhere in the world. 
The only exception to this is that current United States law requires, and therefore MicroImages must restrict the delivery of all our products by any means to the embargoed nations of:
"If you wish to provide your standard University disclaimer by FAX or airmail I will be happy to sign and return it." "It may be of interest to you to reread my long MicroImages MEMO entitled Announcing TNTlite dated 25 March 1996 and which accompanies each TNTlite CD-ROM. It describes why TNTlite has been released into the public domain. It seems that this logic closely conforms to [a name] University's local and remote campus needs in supporting the teaching of geospatial ideas. About the only thing we would like is for someone to let us know from time to time how this product works for [a name] University." "You may wish to provide a TNTlite CD-ROM to your campus, and especially to your off-campus students as part of their course materials. Additional kits of the TNTlite CD-ROM and the accompanying printed materials can be obtained in quantity 10 at US$75 which includes airmail shipping. [A name] at your [a name] Campus has just ordered a set of 10." "I am sending you by airmail a preliminary copy of a printed flier advertising a color illustrated workbook to be used as a class exercise syllabus for use with the image analysis portion of TNTlite. This color syllabus, sample exercise data, and our TNTlite CD-ROM has been assembled by a California professor and a private Oregon Company and may suit your student tutorial needs. This complete kit of materials sells for U.S.$60 (manual, CD, datasets). It has been assembled to be placed in bookstores for direct sale as a complete unit to students enrolled in courses which use TNTlite for their laboratory exercises. At $60 it is being distributed near cost, but a lower quantity price might be negotiated with its publisher. Please contact them directly if this kit is of interest to [a name] University." "If you or the [a name] faculty should have any questions please contact MicroImages at any time. Free software support is also available to any user of TNTlite including students via email, FAX, or voice phone." [MicroImages greatly appreciates the concern of this university about doing things right. But, this exchange illustrates the most significant challenge encountered so far in promoting the unique concept of TNTlite. In general, potential students and especially professionals simply do not believe that it is a free with no strings attached working product. Most seem to think it is some sort of ruse until they try it, as at this university. The next hurdle is convincing Mac and Windows users that all these versions are combined on a single CD-ROM.] from email from a TNTlite faculty user in Indiana on 4 November 1996. "I am a Professor of Anthropology at [a university]. My general application area: Archaeology." "I will be using this software to process geophysical data (magnetic and resistivity site surveys) gathered during our archaeology field school. We are adopting TNTlite as the software of choice to teach our students in the field class. Based on our initial trials, it looks great and provides a future upgrade path to full-featured software." "To learn more about our field school, see our web page at: [a web address followed] and stay tuned - new maps crediting TNTlite will be appearing on that page soon." from email from a TNTmips dealer in Australia on 7 November 1996. "We recently used the colour binarization module to separate contours, streams and roads form a scanned map then we vectorized each layer separately. We were amazed at how effective it was. 
The map was of poor quality the paper turning yellow and colours were faded, but the colours separated perfectly and the results were no different from scanned repro mats(?)." "Our two TNTmips operators [names] are up to speed now and doing a wide range of GIS/RS work. We are building a reputation as geographic fix-it people. If data needs to be reformatted or transformed or otherwise processed, there is always a way to do it in TNTmips." [Continues on other topics.] from email from a new TNTmips client in Arizona on 7 November 1996. "I have been working with TNTlite for the past six weeks, and have been very happy with its capabilities, so much so that I have just ordered TNTmips. I have done a lot of work with ERDAS Imagine and have tested ENVI. The one feature they offer that I would like to see in TNTmips would be a link between the feature space (scatter plot, raster correlation) image and displayed raster file. Such a feature is very helpful in determining what class a pixel or group of pixels fall in. This is especially helpful when a limited number of ground truth sites are available, or when no ground truth sites are available." [Continues on to expand upon request for new feature.] from email from a TNTmips client in Sweden on 11 November 1996. "Ok I broke down and finally installed Win 95, giving Gates his pound of flesh." "Printing problem appears to be solved. Interestingly enough, before I installed TNT for 95, I ran 16 bit TNT under Win 95 and the problem was also gone." [From this it appears that the printing problem was in the setup of W31 which is common. Also, there has never been a 16-bit version of the TNT products, only a 16-bit W31.] "TNT is noticeably snappier under 95." [and the computer was not changed!] from email from a TNTmips client in Germany on 13 November 1996. "Using TNTmips, we have successfully finished our project mentioned in the communication of July 8, 1996 (an environmental assessment of federal road construction in Germany). This study surveyed an area of 175 km2, it describes its environmental factors and analyzes areas of different environmental sensitivity. The vector combination process of version 5.4, an essential tool for our study, worked quite fine, with large datasets imported from DXF and from ARC files." "We are struggling hard to promote the use TNTmips and its acceptance by authorities. You will be glad to hear that the project maps we produced with TNTmips have found very positive resonance at state authorities and public presentations. This week, the maps are used for poster presentation at a colloquium on EU [European Community] rules for road planning and environmental assessment. Among the participants are EU officers, German Federal Traffic Ministry officers, and consulting companies. Participants also get a paper with a detail colour print of our maps (see copy sent with this fax) and a text describing our work and the advantages of TNTmips as a GIS software for environmental assessment. Hopefully, I will also manage to write a paper for a German landscape and environmental planning magazine." "In August, I held a workshop for 20 participants of a one-year GIS and environmental informatics course at the Siemens-Nixdorf Training Center Munich (I had informatics training there three years ago). I used our project data for demonstrating the possibilities of TNTmips and its practical application in GIS analysis and map production. The students were very enthusiastic about the features of the software. 
This is why you should have received quite a number of orders for TNTlite, especially from a Mr. [a name], the course manager of the Training Center, and from Mr. [a name], a participant. Possible, the Training Center will use TNTlite as a course software for the next year's GIS course. I strictly recommended Mr. [a name] to use the system instead of ARCView." [MicroImages has indeed shipped more TNTlite kits to Germany than to any other country outside the United States.]

"For you, the promotion of TNTmips means business. For us, it is important that our customers, mostly authorities, learn to know and appreciate the name and value of this software. Still, most German authorities think that GIS and ARCInfo are the same thing, and we have to mention this name in acquisition papers to get the jobs. This is why we try to make TNTmips more popular. However, our recent project results with TNTmips will certainly change the mind of several authorities." [Continues on to request new features. Note that this is a client and not a MicroImages dealer speaking. Again, our clients are our best sales force, and we appreciate their loyal support.]

For simplicity, the following abbreviations were used in this MEMO:
W31 = Microsoft Windows 3.1 or 3.11.
NT or NT4 = Microsoft NT 3.1, 3.5, or 4.0 (3.1 is error prone, and thus the TNT products require the use of 3.5 and its subsequent patches).
W95 = Microsoft Windows 95.
Mac = Apple Macintosh using the 68xxx Motorola processor and MacOS 6.x or 7.x.
PMac or Power Mac = Apple Macintosh using the 60x Motorola PowerPC processor and MacOS 7.x.
MI/X = MicroImages' X server for the Mac and PC microcomputer platforms and operating systems.
11th Floor - Sharp Tower, 206 South 13th Street, Lincoln NE 68508-2010 USA
Business & Sales: (402)477-9554  Support: (402)477-9562  Fax: (402)477-9559
Business: info@microimages.com  Support: support@microimages.com  Web: webmaster@microimages.com