Size-Depth Tradeoffs for Boolean Formulae

Authors

  • Maria Luisa Bonet
  • Samuel R. Buss
Abstract

We present a simplified proof that Brent/Spira restructuring of Boolean formulas can be improved to allow a Boolean formula of size n to be transformed into an equivalent log-depth formula of size O(n^α) for arbitrary α > 1.
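
To make the restructuring concrete, the sketch below shows the classical Spira-style balancing idea that the paper refines, not the authors' improved construction: pick a subformula G holding between one third and two thirds of the leaves, rewrite F as (G AND F[G←1]) OR ((NOT G) AND F[G←0]), and recurse. The Node class and all helper names are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of classical Spira-style formula balancing (illustrative
# assumption; not the paper's improved construction).

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    op: str                       # 'var', 'const', 'not', 'and', 'or'
    left: Optional['Node'] = None
    right: Optional['Node'] = None
    label: str = ''               # variable name for 'var', '0'/'1' for 'const'

def size(f: Node) -> int:
    """Formula size = number of leaves."""
    if f.op in ('var', 'const'):
        return 1
    return size(f.left) + (size(f.right) if f.right else 0)

def depth(f: Node) -> int:
    if f.op in ('var', 'const'):
        return 0
    return 1 + max(depth(f.left), depth(f.right) if f.right else 0)

def find_separator(f: Node) -> Node:
    """Walk toward the larger child until the subtree holds at most 2/3 of the
    leaves (each step at most halves the size, so it also holds more than 1/3)."""
    n, g = size(f), f
    while size(g) > 2 * n / 3:
        g = g.left if (g.right is None or size(g.left) >= size(g.right)) else g.right
    return g

def substitute(f: Node, g: Node, bit: str) -> Node:
    """Copy of f with the occurrence of the subtree g replaced by the constant bit."""
    if f is g:
        return Node('const', label=bit)
    if f.op in ('var', 'const'):
        return Node(f.op, label=f.label)
    return Node(f.op, substitute(f.left, g, bit),
                substitute(f.right, g, bit) if f.right else None, f.label)

def balance(f: Node) -> Node:
    """Return an equivalent formula of depth O(log size(f))."""
    if size(f) <= 2:
        return f
    g = find_separator(f)
    g_bal = balance(g)
    f1 = balance(substitute(f, g, '1'))    # f with the subformula g forced true
    f0 = balance(substitute(f, g, '0'))    # f with the subformula g forced false
    # Shannon-style expansion on g:  f == (g AND f1) OR ((NOT g) AND f0)
    return Node('or', Node('and', g_bal, f1),
                Node('and', Node('not', g_bal), f0))

# Example: a left-leaning AND-chain of 64 variables has depth 63; rebalancing
# brings the depth down to O(log n) at the cost of duplicating subformulas.
chain = Node('var', label='x0')
for i in range(1, 64):
    chain = Node('and', chain, Node('var', label=f'x{i}'))
print(depth(chain), depth(balance(chain)))
```

Each recursive call works on a formula with at most 2n/3 + 1 leaves, which gives the logarithmic depth; the duplication of G and of the two restrictions is what inflates the size, and bounding that blow-up by O(n^α) for any α > 1 is the improvement summarized in the abstract above.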

Similar articles

Area–Time Performances of Some Neural Computations

The paper aims to show that VLSI-efficient implementations of Boolean functions (BFs) using threshold gates (TGs) are possible. First we detail depth-size tradeoffs for COMPARISON when implemented by TGs of variable fan-in (∆); a class of polynomially bounded TG circuits having O(lg n / lg ∆) depth and O(n / ∆) size for any 3 ≤ ∆ ≤ c·lg n improves on the previously known size O(n). We then procee...
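
For context on the abstract above: with unbounded fan-in and exponentially growing weights, COMPARISON is already computable by a single threshold gate; the question addressed there is what depth and size are needed once the fan-in is restricted to ∆. The sketch below shows only that one-gate baseline and is an illustrative assumption, not code from the paper.

```python
# One threshold gate computing COMPARISON (is x > y for n-bit numbers?),
# using fan-in 2n and weights growing like 2^i -- the baseline that
# bounded-fan-in constructions improve on.  Illustrative sketch only.

def threshold_gate(bits, weights, threshold):
    """Fire (return 1) iff the weighted sum of the 0/1 inputs reaches the threshold."""
    return int(sum(w * b for w, b in zip(weights, bits)) >= threshold)

def comparison(x_bits, y_bits):
    """x > y, bits given most significant first:  sum_i 2^i * (x_i - y_i) >= 1."""
    n = len(x_bits)
    pow2 = [2 ** (n - 1 - i) for i in range(n)]
    return threshold_gate(x_bits + y_bits, pow2 + [-w for w in pow2], 1)

print(comparison([1, 0, 1], [1, 0, 0]))   # 5 > 4  ->  1
print(comparison([0, 1, 1], [1, 0, 0]))   # 3 > 4  ->  0
```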

Average-case complexity of detecting cliques

The computational problem of testing whether a graph contains a complete subgraph of size k is among the most fundamental problems studied in theoretical computer science. This thesis is concerned with proving lower bounds for k-CLIQUE, as this problem is known. Our results show that, in certain models of computation, solving k-CLIQUE in the average case requires Ω(n^(k/4)) resources (moreover, k/...

Size-depth-alternation tradeoffs for circuits

A Boolean circuit is a directed acyclic graph with some designated input gates of fan-in zero and one designated output gate of fan-out zero, in which every non-input node is labeled with or, and, or not. All or and and gates have fan-in two, and all not gates have fan-in one. We assume that the gates of a Boolean circuit are arranged in layers; each layer consists of gates whose inputs come only fro...
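
A small sketch may help fix the layered model just described: fan-in-two AND/OR gates, fan-in-one NOT gates, each gate reading only values produced by circuit inputs or by gates in earlier layers. The representation and the evaluate helper are illustrative assumptions, not taken from that paper.

```python
# A layered Boolean circuit in the sense described above.  Illustrative sketch only.

from typing import Dict, List, Tuple

Gate = Tuple[str, List[str]]                 # (operation, names of its inputs)

def evaluate(layers: List[Dict[str, Gate]], inputs: Dict[str, bool]) -> Dict[str, bool]:
    """Evaluate every gate, layer by layer, and return all computed values."""
    values = dict(inputs)
    for layer in layers:                     # earlier layers are evaluated first
        for name, (op, args) in layer.items():
            a = [values[x] for x in args]    # inputs must already be available
            if op == 'and':
                values[name] = a[0] and a[1]
            elif op == 'or':
                values[name] = a[0] or a[1]
            elif op == 'not':
                values[name] = not a[0]
            else:
                raise ValueError(f'unknown gate type: {op}')
    return values

# Example: out = (x AND y) OR (NOT z), with two layers of gates above the inputs.
layers = [
    {'g1': ('and', ['x', 'y']), 'g2': ('not', ['z'])},
    {'out': ('or', ['g1', 'g2'])},
]
print(evaluate(layers, {'x': True, 'y': False, 'z': False})['out'])   # True
```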

Neural Computing with Small Weights

Jehoshua Bruck, IBM Research Division, Almaden Research Center, San Jose, CA 95120-6099

An important issue in neural computation is the dynamic range of weights in the neural networks. Many experimental results on learning indicate that the weights in the networks can grow prohibitively large with the size of the inputs. Here we address this issue by studying the tradeoffs between the depth and t...

Journal title:
  • Inf. Process. Lett.

Volume 49, Issue -

Pages -

Publication year 1994