Newton Series, Coinductively
Authors
Abstract
We present a comparative study of four product operators on weighted languages: (i) the convolution, (ii) the shuffle, (iii) the infiltration, and (iv) the Hadamard product. Exploiting the fact that the set of weighted languages is a final coalgebra, we use coinduction to prove that a classical operator from the difference calculus in mathematics, the Newton transform, generalises (from infinite sequences) to weighted languages. We show that the Newton transform is an isomorphism of rings that transforms the Hadamard product of two weighted languages into an infiltration product, and we develop various representations for the Newton transform of a language, together with concrete calculation rules for computing them.
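To make the sequence case concrete: for infinite sequences (weighted languages over a one-letter alphabet), the Newton transform is commonly read as sending a stream σ to the stream of iterated finite differences evaluated at 0, i.e. n ↦ (Δ^n σ)(0) with (Δσ)(n) = σ(n+1) − σ(n), and the original stream is recovered from its Newton series σ(n) = Σ_{k≤n} C(n,k)·(Δ^k σ)(0). The Haskell sketch below covers only this stream sub-case under that reading; the representation of streams as lazy lists of integers and the names delta, newton and newtonInverse are illustrative choices, not the paper's notation.

-- Sketch of the stream (one-letter) case only; assumes the reading of the
-- Newton transform as iterated finite differences evaluated at 0, which may
-- differ in presentation from the paper's formulation.

-- Streams of integers, represented as lazy infinite lists.
type Stream = [Integer]

-- Finite-difference operator: (delta s)(n) = s(n+1) - s(n).
delta :: Stream -> Stream
delta s = zipWith (-) (tail s) s

-- Newton transform of a stream: its n-th element is (delta^n s)(0).
newton :: Stream -> Stream
newton s = map head (iterate delta s)

-- Inverse via the Newton series: s(n) = sum over k of C(n,k) * (newton s)(k).
newtonInverse :: Stream -> Stream
newtonInverse t =
  [ sum [ binomial n k * tk | (k, tk) <- zip [0 .. n] t ] | n <- [0 ..] ]
  where
    binomial n k = product [n - k + 1 .. n] `div` product [1 .. k]

-- Example: the squares 0,1,4,9,... have Newton transform 0,1,2,0,0,...
squares :: Stream
squares = [ n * n | n <- [0 ..] ]

main :: IO ()
main = do
  print (take 6 (newton squares))                  -- [0,1,2,0,0,0]
  print (take 6 (newtonInverse (newton squares)))  -- [0,1,4,9,16,25]

The full weighted-language setting replaces streams by functions from words over an arbitrary alphabet to ring elements; the sketch only illustrates the "infinite sequences" starting point mentioned in the abstract.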
Similar resources
Error estimation of fuzzy Newton-Cotes method for Integration of fuzzy functions
The fuzzy Newton-Cotes method for integration of fuzzy functions was proposed by Ahmady in [1]. In this paper we construct error estimates for fuzzy Newton-Cotes methods, such as the fuzzy Trapezoidal rule and the fuzzy Simpson rule, by using Taylor series. The corresponding error terms are proven in two theorems. We prove that the fuzzy Trapezoidal rule is exact for fuzzy polynomials of degree one and...
Full text
Infinitary $\lambda$-Calculi from a Linear Perspective (Long Version)
We introduce a linear infinitary λ-calculus, called `Λ∞, in which two exponential modalities are available, the first one being the usual, finitary one, the other being the only construct interpreted coinductively. The obtained calculus embeds the infinitary applicative λ-calculus and is universal for computations over infinite strings. What is particularly interesting about `Λ∞ is that the re...
Full text
Functional Logic Programming with Generalized Circular Coinduction
We propose a method to adapt functional logic programming to deal with reasoning on coinductively interpreted programs as well as on inductively interpreted programs. In order to do so, we consider a class of objects interesting for this coinductive interpretation, namely regular terms. We show how the usual data structures can be adapted to capture these objects. We adapt the operational seman...
Full text
On the Use of Quasi-Newton-Based Training of a Feedforward Neural Network for Time Series Forecasting
This paper examines the effectiveness of using quasi-Newton-based training of a feedforward neural network for forecasting. We have developed a novel quasi-Newton-based training algorithm using a generalized logistic function. We have shown that a well-designed feedforward structure can lead to a good forecast without the use of the more complicated feedback/feedforward structure of the recur...
Full text
Rational Interpolation and Basic Hypergeometric Series
We give a Newton-type rational interpolation formula (Theorem 2.2). It contains as special cases the original Newton interpolation formula and the recent interpolation formula of Zhi-Guo Liu, which allows one to recover many important classical q-series identities. We show in particular that some bibasic identities are a consequence of our formula.
Full text