Information cost functions

  • Hrvoje Šikić
  • Mladen Victor Wickerhauser

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

A best orthogonal basis for a vector is selected from a library to minimize a cost function of the expansion coefficients. How the selected basis depends on the cost function, and under what conditions it provides the fastest nonlinear approximation, are still open questions which we partially answer in this paper. Squared expansion coefficients may be considered a discrete probability density function, or pdf. We apply some inequalities for pdfs to obtain three positive results and two counterexamples. We use the notion of subexponentiality, derived from the classical proof of an entropy inequality, to derive a number of curious inequalities relating different information costs of a single pdf. We then generalize slightly the classical result that one pdf majorizes another if it is cheaper with respect to a large-enough set of information cost functions. Finally, we present inequalities that bracket any information cost for a pdf between two functions of norms of the pdf, plus a counterexample showing that our result has a certain optimality. Another counterexample shows that, unfortunately, the set of norm-type cost functions is not large enough to imply majorization. We conclude that all information cost functions are weakly comparable to norms, but this is not quite enough to guarantee in general that the cheapest-norm pdf majorizes.
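As a minimal sketch of the idea in the abstract (not code from the paper itself), one can normalize the squared expansion coefficients into a discrete pdf and evaluate an information cost on it; Shannon entropy is used here as one illustrative cost function, and the function name `entropy_cost` is an assumption for this example. A basis that concentrates the vector's energy in few coefficients yields a cheaper (lower-entropy) expansion than one that spreads the energy evenly:

```python
import math

def entropy_cost(coeffs):
    """Shannon entropy of the pdf formed by normalized squared coefficients.

    This is one example of an information cost function; the paper
    studies a whole class of such functions.
    """
    # Squared coefficients, normalized to sum to 1, form a discrete pdf.
    p = [c * c for c in coeffs]
    total = sum(p)
    p = [x / total for x in p]
    # Entropy; terms with p = 0 contribute 0 by convention.
    return -sum(x * math.log(x) for x in p if x > 0)

# Two expansions of the same vector (equal energy) in different bases:
concentrated = [2.0, 0.0, 0.0, 0.0]  # energy in a single coefficient
spread = [1.0, 1.0, 1.0, 1.0]        # energy spread evenly

# The concentrated expansion is cheaper, so a best-basis search
# minimizing this cost would prefer it.
print(entropy_cost(concentrated))  # 0.0
print(entropy_cost(spread))        # log(4) ≈ 1.386
```

The spread expansion attains the maximum possible entropy (log of the number of coefficients), while the perfectly concentrated one attains the minimum, zero; fast nonlinear approximation favors the concentrated case.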

Original language: English
Pages (from-to): 147-166
Number of pages: 20
Journal: Applied and Computational Harmonic Analysis
Volume: 11
Issue number: 2
State: Published - Sep 2001

Keywords

  • Best basis
  • Concave
  • Entropy
  • Majorization
  • Nonlinear approximation
  • Pdf
  • Rearrangement inequality
  • Schur functional
  • Subexponential
  • Wavelet packet library
