ARDID: A Tool for the Quality Analysis of VHDL based Designs

Authors

  • Y. Torroja
  • C. Lopez
  • M. García
  • T. Riesgo
  • E. de la Torre
  • J. Uceda
Abstract

In this paper, a tool developed to help in the VHDL design flow of complex systems on silicon is presented. This tool aims to help designers and project managers to improve the quality of their VHDL based designs. The tool includes functions oriented towards design management, as well as functions to analyse the quality of the VHDL descriptions from different points of view. Together, these functions help to improve the global quality of a design and to reduce development time. The paper first gives an overview of the quality attributes normally used to assess the quality of a VHDL description. Based on this overview, the selection of quality attributes implemented in the tool is presented. Then, a brief description of the tool developed by the authors is given. Finally, some experimental results and conclusions are presented. This tool has been partially developed within the TOMI project.

This work is partially funded by the EC (ESPRIT #20724). Partners: ES2, LEDA, TI+D, SIDSA and UPM.

1. Quality Attributes of VHDL Designs

The quality of a product, i.e. a digital design, is a fuzzy concept that can be considered from different points of view. The most widely accepted definition of quality is the fulfilment of explicit or implicit requirements. Although there are several definitions of quality [1]-[3], "quality can not be well defined, but it can, and should be modelled" [4]. In a broad sense, the final quality of a product can be understood as the result of the quality of the design process and the quality of the product itself. The quality of the design process is influenced by several factors: management, design skills, available tools, good documentation, design methodology, etc. In order to assess the quality of the design process it is necessary not only to analyse the process itself but also to measure the quality of the product at every moment. In the case of VHDL based designs, the quality of the final product can be analysed by considering the VHDL descriptions. As noted above, the features to consider depend on the quality requirements of the design; nevertheless, there is a common set of features that is normally considered [5]-[7]. Instead of giving a precise definition of these features, a more practical, "for use" definition applied to VHDL descriptions is given:

• Maintainability: an indicator of the effort needed to modify or correct a design. Maintainability is mainly related to readability, design organization and design tool capabilities.
• Readability: reflects the ease with which a description can be read and understood by a person. It is related to aspects like complexity, naming conventions or the profusion of comments.
• Complexity: reflects how difficult the development or interpretation of a description is, or has been. This feature is related to aspects like code size, nesting level, degree of modularization, etc.
• Portability: the ease with which a design can be used in an environment different from the one that generated it. It can be considered from different points of view: portability between tools, between design target technologies, between users or applications, etc.
• Reusability: an indicator of the ease with which a design can be used as part of a bigger design, or adapted to a new application. It is related to aspects like portability, maintainability, degree of parameterization and ease of integration into the design flow.
• Simulation performance: reflects the efficiency of the simulation process for a design. Code complexity, modularization degree, or the number and type of data objects are factors that directly affect simulation performance.
• Compliance with guidelines: reflects the degree to which certain rules and guidelines have been followed during development. These guidelines can affect, among others, naming conventions, design style, code complexity or any other feature of the design.
• Synthesis efficiency: reflects the quality of the hardware obtained when the design is synthesised. It can be considered from the point of view of design performance (in terms of area, power consumption or delays), or from the reliability point of view.
• Testability and verifiability: testability is related to the ease of obtaining a set of patterns for the manufacturing test of a design. Verifiability reflects the ease of developing a good test bench to verify the design functionality.

In order to assess these features, it is necessary to identify attributes of the descriptions that can be quantitatively or qualitatively measured. These attributes can subsequently be divided into sub-attributes that are easier to measure or assess. Finally, the set of measures obtained from these sub-attributes is the input of a function that globally assesses the considered feature.

Several VHDL quality analysis methods can be found in the literature [8]-[16]. Some of them are static analysis methods (mainly those imported from software engineering) [8]-[11], while others use dynamic techniques (especially those dealing with functional validation) [15], [16]. Most of these quality analysis methods are based on checking the compliance of VHDL descriptions with certain VHDL coding guidelines. In contrast, most of the quality checkers included in ARDID (and described below) are based on the analysis of a simplified version of the hardware that will be synthesised from the descriptions, as will be seen in section 2.3.

2. ARDID: a VHDL Quality Analysis Tool

ARDID is a graphical front-end environment specially designed to work with VHDL designs. The main objective of the tool is to provide the designer with methods to increase the quality of the final design, and with automatic methods to help in the review of the architectural and logic design stages of a VHDL design process. The tool includes four main functions: a source code Version Control System; a VHDL Library Manager; a VHDL Design Quality Tool Kit, which includes several quality checkers; and a VHDL Validation Quality Tool, which analyses the quality of the test benches used for the validation of the design. The tool also includes integration capabilities that allow the user to call external programs (simulation, synthesis) as part of the environment. In the following paragraphs, a brief description of each part is given.

2.1 The Version Control System (VCS)

In our experience, VHDL designers lack a culture of using version control systems, normally because such tools are not integrated into the design environment they use. For this reason, a source code Version Control System has been integrated into ARDID. The Version Control System is a graphical shell on top of GNU's Revision Control System (RCS) [17]. The shell allows the user to see graphically which files are under control and which are editable or read-only, to check files in and out easily, to see differences, and to use almost all of the functionality that RCS offers.
2.2 The VHDL Library Manager (VLM)

In order to easily apply the quality checkers and manage the design, ARDID includes a graphical library manager with the functions typically found in this kind of tool. The user can create, delete or link libraries, compile source code into a library, edit source code, update compiled versions and obtain scripts to automatically recompile the libraries. The Library Manager is a graphical shell on top of LEDA's VHDL System [18].

2.3 The VHDL Design Quality Tool Kit (QTK)

The VHDL Design Quality Tool Kit aims to detect design methods or VHDL constructs that are likely to produce problems in later phases of the design. This normally implies a detailed analysis of the VHDL code in order to detect such problems and to obtain an overall view of the design quality. Quality is measured by the code's accordance with a set of rules, stated by expert designers, that will avoid unpleasant surprises at the end of the design process. Aspects related to code layout, naming conventions, signal clocking and initialisation, and many others are normally covered. Although not difficult, this is a tedious task; the proposed quality checkers greatly help by performing these checks automatically. These quality metrics are not only useful for reviewers but also for designers, helping them to produce better VHDL descriptions and to reduce the number of iterations in the design process.

Reusability is another important objective of every design. The high cost associated with a design is driving final customers and design centres to invest a lot of effort in making designs reusable, which implies that a design should satisfy certain requirements in order to be industrially reusable. The VHDL Design Quality Tool Kit helps designers analyse their designs in order to fulfil some of these requirements. For final users, the checkers help to analyse the quality of the macro-cells they are using without requiring a deep knowledge of their functionality.

These tools have been implemented on top of the LEDA VHDL System tools (LVS). LVS is a set of tools that provides access, through an application procedural interface, to the VHDL Intermediate Format (VIF). The results of the checkers are presented to the user in different ways. First, it is possible to obtain a hierarchical text description of the results. Through hyperlinks, the user can access more detailed information on each result, which is finally back-annotated onto the VHDL source code. In the current release, the VHDL Design Quality Tool Kit provides the following checkers to analyse the VHDL description:

Sensitivity List Analysis: When developing VHDL code, it is sometimes difficult for designers to be aware of errors or omissions in sensitivity lists. This may cause errors that are only discovered after synthesis, delaying the project development. The Sensitivity List Analysis points out processes that are erroneous or deserve attention, providing the following results depending on the signals read in the process (a small example follows this list):
• There is no sensitivity list
• Only the clock and reset are in the sensitivity list
• There are signals read in the process that are not in its sensitivity list
• The reset or clock signal does not appear in the sensitivity list of a sequential process
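As an illustration of the kind of construct this checker targets, the minimal fragment below is a hypothetical example of ours (entity and signal names are not taken from the paper). The process reads enable but omits it from the sensitivity list, so a simulator will not re-evaluate the process when enable changes while synthesis will still build the intended combinational logic; this is exactly the pre/post-synthesis mismatch described above.

  library ieee;
  use ieee.std_logic_1164.all;

  entity mux_example is
    port (a, b, sel, enable : in  std_logic;
          y                 : out std_logic);
  end entity mux_example;

  architecture rtl of mux_example is
  begin
    -- "enable" is read inside the process but is missing from the sensitivity list
    combo : process (a, b, sel)
    begin
      if enable = '1' then
        if sel = '1' then
          y <= a;
        else
          y <= b;
        end if;
      else
        y <= '0';
      end if;
    end process combo;
  end architecture rtl;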
Architectural Description Style: The description style has an influence on code maintainability and synthesis results; in particular, behaviour embedded in structural descriptions makes the code cumbersome. This analysis helps designers by classifying each unit checked according to the following possibilities:
• Description with component instantiations only
• Description with component instantiations and simple signal assignments
• Mixed description with components and behavioural statements (warning)
• Behavioural description
• Data-flow description

Object Usage: While coding VHDL descriptions, designers often declare objects that are never used. This worsens not only code maintainability but also the detection of hidden errors (for example, an address strobe signal that is declared but, afterwards, does not take part in the chip-select decoding block). Designers who want to avoid this problem find the task heavy and monotonous, and not always successful; an automatic search is therefore needed. The Object Usage Analysis reports unused ports, signals, variables, constants and generics.

Hard-Coded Integer Values: Code reusability requires hard-coded values in the description to be replaced by constants or generics, so that the characteristics of a module can be modified easily. Integers hard-coded in the VHDL descriptions are detected and highlighted. A "clever" algorithm is used so that not all values are reported, but only those of interest to the designer (e.g. the lower limit in range constraints, as in BusWidth downto 0, or values in increment/decrement expressions of counters, as in Count <= Count + 1, are not reported).

High Impedance Signal Analysis: In some ASIC design methodologies or technologies, the use of tri-state signals is not allowed or not recommended. This analysis highlights the signals that will be synthesised as tri-state buffers.

Clock and Reset Analysis: Clock and reset schemes are among the most critical issues in a design. It is important for designers to know not only which signals are registered by a clock or initialised by a reset signal, but also whether the clock or reset may cause problems due to glitches, etc. (a small illustrative fragment follows the lists below). For every clock described in the design, this analysis points out:
• Whether the clock comes from combinational or sequential logic, or from a port
• The signals and variables triggered by the clock signal
• Whether the clock drives combinational logic, the data input of flip-flops, etc.
For every reset described in the design, this analysis points out:
• Whether the reset comes from combinational or sequential logic, or from a port
• The signals and variables initialised by the reset signal
• Whether the reset is a source of combinational logic, drives the data input of flip-flops, etc.
• Whether the initialisation is dynamic or static
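The sketch below is a hypothetical example of ours showing a typical registered counter; the comments indicate, as we understand the checker's outputs listed above, what the Clock and Reset Analysis would presumably report for it: both clock and reset come from ports, count_i is triggered by clk, and it is statically initialised by an asynchronous reset.

  library ieee;
  use ieee.std_logic_1164.all;
  use ieee.numeric_std.all;

  entity counter_example is
    port (clk, rst_n : in  std_logic;
          count      : out unsigned(7 downto 0));
  end entity counter_example;

  architecture rtl of counter_example is
    signal count_i : unsigned(7 downto 0);
  begin
    count <= count_i;                      -- output driven directly by a register

    seq : process (clk, rst_n)
    begin
      if rst_n = '0' then                  -- reset comes from a port; static initialisation
        count_i <= (others => '0');
      elsif rising_edge(clk) then          -- clock comes from a port
        count_i <= count_i + 1;            -- count_i is triggered by clk
      end if;
    end process seq;
  end architecture rtl;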
Clock Frontiers Analysis: Clock frontiers are a source of problems in designs with several clocks. It is important for designers to know which signals lie on a clock frontier, in order to pay special attention to their behaviour. For every registered signal in the design, the checker analyses whether the data of a memory element registered by one clock comes from a memory element registered by another clock, and points out the signals and the clocks involved in the clock frontiers.

Registered Objects Analysis: In the design process, it is important to know which signals will be registered by a flip-flop or a latch. This is normally known only after synthesis, but the synthesis process usually takes too much time to be used as a check method. This checker quickly detects whether a signal or variable will be registered with a flip-flop or a latch.

Registered Outputs Analysis: When a design is synthesised on a module basis, timing problems may arise when all the modules are put together. These problems are minimised if the output signals of a module come directly from a flip-flop. This checker analyses the outputs of the modules and reports:
• Outputs coming directly from a flip-flop
• Outputs coming directly from a latch
• Outputs that are a combinational function of the module inputs
• Outputs that depend combinationally on other internally registered signals

Combinational Feedback Analysis: Combinational feedback is a source of problems when a circuit is designed. Designers do not usually insert combinational feedback when they describe a module, but this situation is more difficult to detect when all the modules are put together in the design hierarchy. Most synthesis tools detect this kind of problem when they deal with the whole circuit but, again, the time spent in synthesis can be prohibitive if this check is to be applied regularly.

Connectivity Browser: When the checkers report a problem, it is sometimes not enough to know the objects (signals or variables) involved. A connectivity tool that allows the signal path to be traced may help to discover the source of the problem. The connectivity browser shows the data path or control path of a signal graphically, back-annotating the VHDL sentences involved in the path directly onto the source code.

Most of these checkers accept full VHDL, although the checkers that deal with clocks, resets and registered objects handle only the VHDL subset and hardware semantics considered in the IEEE proposal for synthesis of VHDL descriptions [19] (mainly the Synopsys® subset with some limitations). These checkers are based on a simplified model of the hardware that will be obtained after synthesis. In this model, memory elements are obtained explicitly, and only the dependencies between signals are considered for the combinational parts (see Figure 1). The objects of this model are stored together with the VHDL Intermediate Format to allow back-annotation. The checkers perform their analysis on this model, accessing the data through a procedural interface.
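To make the simplified hardware model more concrete, consider the small hypothetical fragment below (the names are ours). As we read the description above, the model would retain only the dependency of d_gated on d and en for the combinational assignment, not the logic function itself, while the clocked process would contribute one explicitly identified memory element.

  library ieee;
  use ieee.std_logic_1164.all;

  entity model_example is
    port (clk, d, en : in  std_logic;
          q          : out std_logic);
  end entity model_example;

  architecture rtl of model_example is
    signal d_gated : std_logic;
  begin
    -- combinational part: only the dependency d_gated <- {d, en} is retained
    d_gated <= d and en;

    -- sequential part: one flip-flop for q, clocked by clk, with data input d_gated
    reg : process (clk)
    begin
      if rising_edge(clk) then
        q <= d_gated;
      end if;
    end process reg;
  end architecture rtl;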



Publication date: 1999