Most people purchasing an asset tracking/management application are implementing a formal system for the first time, both in their organizations and in their careers. This can make it difficult to weigh the importance of different feature sets. More broadly, a number of system design elements are nuanced or easily overlooked yet have major implications for user experience and product longevity. To provide some guidance, here are several areas to consider.
Today’s software environments allow asset data to be transferred to a mobile device where transactions are entered, much as they are on the desktop. When the device is then synced with the host, transaction activity on both the host and the mobile device is updated with current results. This approach is convenient because changes can be recorded on either platform and updating is virtually invisible. That convenience has appeal, particularly for smaller databases where the performance latency created by large databases does not occur.
Transaction-based data collection gathers the transactions affecting assets as a batch and transfers them to the host. A separate verification step reviews the transactions, identifies any inconsistencies or errors, and then applies the valid transactions to the assets. This approach provides a far greater degree of transparency and control over all transaction activity. It also improves the overall speed of data collection, because no downloading or repeated syncing is required.
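The batch-then-verify flow described above can be sketched as follows. This is a minimal, hypothetical illustration (the field names, actions, and data structures are assumptions, not any vendor's actual design): transactions are collected as a batch, and a separate verification step flags problems before anything touches the asset master.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Transaction:
    """One collected transaction from a mobile device (hypothetical shape)."""
    asset_id: str
    action: str                      # e.g. "relocate" or "retire"
    new_value: Optional[str] = None  # e.g. the new location

def verify_and_apply(assets: dict, batch: list):
    """Separate verification step: flag inconsistencies, apply the rest."""
    applied, errors = [], []
    for tx in batch:
        if tx.asset_id not in assets:
            errors.append((tx, "unknown asset"))       # held for review, not applied
        elif tx.action == "relocate":
            assets[tx.asset_id]["location"] = tx.new_value
            applied.append(tx)
        elif tx.action == "retire":
            assets[tx.asset_id]["status"] = "retired"
            applied.append(tx)
        else:
            errors.append((tx, "unrecognized action"))
    return applied, errors
```

The point of the sketch is the control boundary: nothing reaches the asset records until it has passed review, and every rejected transaction remains visible for correction.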
Which approach you choose depends on the relative importance you assign to convenience versus control. Small environments of a few hundred assets, where most work is done by a single person, will likely see little difference between the two. As the asset environment grows in complexity (assets, locations, users, and audit requirements), latency and weaker controls will tip the balance toward transaction-based data collection.
One of the goals of implementing a formal asset tracking application is to create consistently accurate data. Doing so requires that, where possible, data be entered once and reused as needed throughout the application. This can be accomplished in at least two ways. The first is to auto-fill fields with pre-assigned values, which speeds up data entry by completing several fields from a single data element.
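The auto-fill approach can be sketched in a few lines. The template codes and field names here are hypothetical examples, not taken from any particular product: choosing one code completes several fields at once.

```python
# Hypothetical pre-assigned templates: one code fills several fields.
TEMPLATES = {
    "LAPTOP": {"category": "IT Equipment", "useful_life_years": 3, "gl_account": "1520"},
    "FORKLIFT": {"category": "Plant Equipment", "useful_life_years": 10, "gl_account": "1540"},
}

def new_asset(asset_id: str, template_code: str, **overrides):
    """Create an asset record, auto-filling fields from the chosen template."""
    return {"asset_id": asset_id, **TEMPLATES[template_code], **overrides}
```

A single keystroke-level choice (the template code) thus stands in for several manual entries, which is where the data-entry speedup comes from.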
The second separates description data from asset-specific information and connects the two through a relationship. Because each description is created only once, care can be taken to make the entry as complete and accurate as possible. Additions and changes to descriptions can be monitored to confirm ongoing consistency, and an audit trail of changes can be established. Data extracted from descriptions managed in this way is, by design, always consistent and accurate. Finally, a relational approach yields significant efficiency in field data collection for new assets.
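The relational approach might look like the following minimal sketch (the record layouts and identifiers are illustrative assumptions): one description record is entered once and referenced by many assets, so every report draws on the same authoritative entry.

```python
# Shared descriptions, entered once and maintained carefully (hypothetical data).
descriptions = {
    "D-100": {"maker": "Acme", "model": "X200", "category": "Laptop"},
}

# Asset-specific records hold only what varies per asset, plus a reference.
assets = [
    {"asset_id": "A-001", "desc_id": "D-100", "serial": "SN123", "location": "HQ"},
    {"asset_id": "A-002", "desc_id": "D-100", "serial": "SN124", "location": "Plant 2"},
]

def asset_report(assets, descriptions):
    """Join each asset's specific data to its shared description at report time."""
    return [{**a, **descriptions[a["desc_id"]]} for a in assets]
```

Because the description exists in exactly one place, correcting it once corrects every asset that references it, which is why extracts stay consistent by design. A flat-file approach would instead copy the description into every asset record, where each copy can drift independently.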
On the surface, there may appear to be little difference between the two approaches, because in each case the data is completed as required. However, unless the initial database is created through a conversion, the efficiency of the relational approach will immediately begin to reduce operating costs and yield more accurate data than a flat-file approach. As time passes, these differences grow significantly.
No aspect provides better evidence of informed design than the thoroughness of reporting capabilities. It is impossible to predict how reporting requirements may change in the future, so it is imperative that the selected software provide broad flexibility in reporting. The following should be considered imperatives:
Ultimately, reporting will strongly influence the user’s experience, and more importantly, it will dictate the life span of the product itself. Robust reporting should be considered non-negotiable.
Transparency in system design is often overlooked in the evaluation of software. By transparency, we mean the system's capability to record all transaction activity completely, make it visible, and provide the checks and balances needed to control misclassification or misappropriation of assets. Consider the following points:
Often, systems record only major transactions (adds, relocations and retirements) rather than tracking changes to descriptions and user-defined data. The usefulness of user-defined data is severely compromised as a result.
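A fuller audit trail captures field-level changes as well. The sketch below is a hypothetical illustration of that idea (the record shape, field names, and user handling are assumptions): every edit, including edits to user-defined fields, produces a log entry with the old and new values.

```python
from datetime import datetime, timezone

# Hypothetical audit trail: every change is logged, not only adds,
# relocations, and retirements.
audit_log = []

def set_field(asset: dict, field_name: str, new_value, user: str):
    """Record old and new values before applying any change to an asset."""
    audit_log.append({
        "asset_id": asset["asset_id"],
        "field": field_name,
        "old": asset.get(field_name),
        "new": new_value,
        "user": user,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    asset[field_name] = new_value
```

With entries like these, a reviewer can reconstruct who changed what and when for any field, which is the check-and-balance capability the paragraph above describes.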
The common practice in many organizations is to evaluate software against a list of present requirements. If that list focuses on the current needs of a specific department or category of assets, it may ignore larger organizational needs that exist now or may develop in the future. The sad reality is that larger organizations often end up with several tracking systems that are neither compatible nor comprehensive. In other cases, a department may be forced to use a tool that has already been purchased, regardless of fit.
The dilemma, of course, is that the current evaluation effort may lack the mandate or the organizational mechanics to assess broader requirements, much less the ability to predict future needs. The solution is to evaluate any system based on its ability to conform to a wide range of user interfaces, reporting, processing, data and security requirements, without having to modify the software. This software utility factor can easily be a deciding factor in the final selection process.
At a minimum, it is best to challenge each solution you are considering with several what-if scenarios: What if another department chooses to use the software? Can the solution address the needs of an entirely different category of assets without splitting the database? What are the security implications if usage is expanded beyond present requirements? In this way you can gauge how adaptable the solution might be in the future.
Defining the speed of a software program can be difficult because slowness shows up in many different areas. A good way to get a feel for throughput is to load a test database of roughly the number of assets you expect to manage and observe the response. While we will not pretend to identify every area to consider, here are several worth evaluating:
Is the system easy to navigate or are forms repeatedly opened and closed to traverse the system while performing routine transactions?
When the Internal Revenue Service changed the regulations regarding De Minimis assets, most asset management systems had no ability to comply. Read our white paper to learn the implications of these rules and how best to achieve compliance at a low cost.