
Most of us know $$\sum_{n=a}^b c_n=c_a+c_{a+1}+\cdots+c_{b-1}+c_b$$ Some of us know $$\prod_{n=a}^b c_n=c_a \cdot c_{a+1}\cdots c_{b-1} \cdot c_{b}$$ A few of us know the continued-fraction operator $$\underset{j=a}{\overset{b}{\LARGE\mathrm K}}\frac{a_j}{b_j}=\cfrac{a_a}{b_a+\cfrac{a_{a+1}}{b_{a+1}+\cfrac{a_{a+2}}{b_{a+2}+\ddots}}}$$ A select few are familiar with the power tower $$\underset{n=a}{\overset{b}{\LARGE\mathrm E}} \ c_n={c_a}^{{c_{a+1}}^{{.}^{{.}^{c_b}}}}$$ All of the above are related by difference equations of the form $$f(n+1)=g(f(n))$$ In fact, I'd go so far as to say all of the above and more can be linked with the following notation (iterated composition): $$\underset{n=1}{\overset{k}{\LARGE\mathrm F}} \ f(n,x)=f(1,f(2,f(3,\dots f(k,x)\dots)))$$ (It should be noted that in addition to the above-mentioned items, Newton's method, Mandelbrot's equation, and the logistic equation fall into this category, so this notation has incredible generality.)
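As a sanity check, here is a minimal sketch of this iterated-composition operator (the names `compose_iter`, `f`, and `ks` are illustrative, not from any library), showing how sums, products, and continued fractions all fall out as special cases:

```python
def compose_iter(f, ks, x):
    """Iterated composition F: compute f(ks[0], f(ks[1], ... f(ks[-1], x) ...)).

    The innermost application uses the last index, so we fold from the right.
    """
    for k in reversed(ks):
        x = f(k, x)
    return x

# Summation: f(n, x) = c_n + x, seeded with x = 0.  With c_n = n:
print(compose_iter(lambda n, x: n + x, list(range(1, 5)), 0))   # 1+2+3+4 = 10

# Product: f(n, x) = c_n * x, seeded with x = 1:
print(compose_iter(lambda n, x: n * x, list(range(1, 5)), 1))   # 1*2*3*4 = 24

# Continued fraction K: f(n, x) = a_n / (b_n + x), seeded with x = 0.
# With a_n = b_n = 1 this converges to (sqrt(5) - 1) / 2:
print(compose_iter(lambda n, x: 1 / (1 + x), list(range(1, 30)), 0))
```

The same loop also covers the power tower ($f(n,x)={c_n}^x$) and general recurrences such as Newton's method, which is the point of the notation.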

Questions: Can you derive Taylor-like methods that allow many functions to be expressed in terms of these other, lesser-known forms of recursion? Does this play a role in why summations are more famous than these other forms of recursion?

Is convergence an issue? Most people know that once you move into difference equations you start having to deal with problems like chaos. Which forms of recursion preserve convergence? Is this a factor in why we don't use things besides summation?

Is it a historical problem? Does the reason summation is more famous have to do with integration? Does this suggest there is an ALTERNATIVE way to integrate? For instance, could an infinite product conceivably integrate a function to the same expression as a standard integral? In other words, can you develop an analogous integral with the above formulations (excluding the product integral, as it isn't equivalent)? Considering multiplication can yield areas of squares and cubes, it wouldn't be too much of a stretch, right? I'm extremely interested in that possibility.
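For concreteness, the classical geometric (type I) product integral satisfies $\prod_a^b f(x)^{dx}=\exp\left(\int_a^b \ln f(x)\,dx\right)$, i.e. the logarithm converts the limiting product into an ordinary Riemann sum. A minimal numeric sketch (the function name is illustrative, not from any library):

```python
import math

def product_integral(f, a, b, n=100_000):
    """Geometric (type I) product integral of f over [a, b]:
    the limit of prod f(x_i)**dx, computed as exp of a midpoint Riemann sum of ln f."""
    dx = (b - a) / n
    log_sum = 0.0
    for i in range(n):
        log_sum += math.log(f(a + (i + 0.5) * dx)) * dx  # log turns the product into a sum
    return math.exp(log_sum)

# For f(x) = e^x on [0, 1]: exp(integral of x dx) = exp(1/2)
print(product_integral(math.exp, 0.0, 1.0))  # ≈ 1.6487
```

Note how this illustrates the non-equivalence complained about above: the product integral of $e^x$ is $e^{1/2}$, whereas the standard integral is $e-1$.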

Preferences: I'd like references and examples. I'm not really interested in "this isn't possible" rhetoric, but I will accept proofs that disprove the existence of the above-mentioned items.

Zach466920
  • One partial reason why the product is not very popular is that you can convert it to a sum by taking logarithms. Where this is not possible, say in field theory, I think products are about as common as sums. – Asvin Apr 07 '15 at 19:40
  • @Asvin That's a good point. Still, that doesn't explain why we don't just use products and convert to sums. Sadly, the more interesting cases, that is K, E, and generalized recurrences, are very specialized. – Zach466920 Apr 07 '15 at 19:46
  • It seems to me you're describing exactly the recurrence sequence $u_{n+1}=f(u_n)$. Clearly the notions of sum/product are very special cases, because of the algebraic structure. – Albert Apr 09 '15 at 21:00
  • @Glougloubarbaki correct, they are all recurrence relations. I mean the E is actually the exponential analogue of summation and product series, but the others are best understood as recurrence relations. – Zach466920 Apr 09 '15 at 21:53
  • http://en.wikipedia.org/wiki/Product_integral – Hasan Saad Apr 09 '15 at 22:14
  • @HasanSaad That's not really useful... considering that the function has to be holomorphic everywhere... it's so inelegant... it almost insists this isn't the way... almost too stringent to be general. Although, +1 for at least partially solving part of the problem. – Zach466920 Apr 09 '15 at 22:26
  • http://dml.cz/bitstream/handle/10338.dmlcz/401134/DejinyMat_29-2007-1_6.pdf Regarding your objection, you might want to check this article. As long as a function is Lebesgue integrable, it seems that the Lebesgue product integral exists as well, and thus it is "general enough", I believe. You might want to give it a more detailed read, though. I didn't read it because I was asleep, but that's what skimming told me. Good luck in your quest. :) – Hasan Saad Apr 09 '15 at 22:38
  • @HasanSaad thank you. – Zach466920 Apr 09 '15 at 22:47
  • Update: this method provides a way to integrate but is not equivalent to the definition of integration. The standard definition of integration and the product integral don't yield the same results. So I'm still interested in a product integration that does this, if it exists. – Zach466920 Apr 10 '15 at 17:05
  • The $F$ operator is related to an old [idea](http://math.stackexchange.com/questions/816749/raising-a-partial-function-to-the-power-of-an-ordinal) of mine. What you're talking about is actually the dual of what I asked about; you are interested in $(f_0 \circ f_1) \circ \cdots$ whereas my question asks about $\cdots \circ (f_1 \circ f_0)$. – goblin GONE Apr 10 '15 at 17:49
  • @goblin Good, I was afraid to put that in. I came across the idea from trying to solve quintic equations. I figured that if there wasn't an explicit solution, you could make one in terms of itself. Then I realized it was a dynamical system, and we all know how "chaotic" those can get. My method worked, but then I realized that Newton's method can work much faster in *most* cases. Of course, using the method like [this](http://math.stackexchange.com/a/1222651/219489) never fails to impress... so now I've been encouraged to find out more about all of this, which is why this question exists. – Zach466920 Apr 10 '15 at 18:30
  • @rank That's fine. This question was less about getting an answer and more about learning about the subjects applicability, main results, and returns on effort (As I might want to look into it deeper). I found the Product Integral to be only a step in the right direction, but your hierarchy seems to be the way to define what I'm considering. – Zach466920 Apr 21 '15 at 14:14
  • With respect to the continued fraction: you'll want to look up the Thiele expansion and reciprocal derivatives. – J. M. ain't a mathematician May 24 '15 at 15:06
  • @Guesswhoitis Looks interesting, but can you expand about a point, or do you have to have global knowledge? – Zach466920 May 24 '15 at 17:51
  • Well, Thiele expansion is the exact analog of Taylor expansion for continued fractions. In this case, instead of evaluating derivatives at your expansion point, you're evaluating reciprocal derivatives. – J. M. ain't a mathematician May 24 '15 at 21:09

3 Answers


Maybe https://en.wikipedia.org/wiki/Iterated_function#Some_formulas_for_fractional_iteration and https://en.wikipedia.org/wiki/Ramanujan_summation can bring some light, but I think it's easy to find oneself lost in greater levels of abstraction without good references or specific examples.

  • I actually have investigated most of this. I was just too lazy to write an answer :) Thanks, although why did you mention Ramanujan summation? – Zach466920 Aug 07 '15 at 21:29
  • The issue is related to convergence, probably because one of Ramanujan's main topics was convergence, but I don't know exactly why I put it... – sigma2sigma Aug 07 '15 at 21:32
  • Oh, OK. Just in case you're curious, the convergence of fractional iteration is more or less unproven. In addition, fractional-iteration expansions don't include the iteration variable. Continued fractions have an analogous Taylor method. In addition to the product integral, the 'exponential integral' could be looked at. I'll accept your answer, have a good one :) – Zach466920 Aug 07 '15 at 21:40

The abstraction you hint at is a standard example in Scheme or LISP programming. This is from page 64 of Abelson and Sussman's classic Structure and Interpretation of Computer Programs (http://web.mit.edu/alexmv/6.037/sicp.pdf):

Exercise 1.32.

Show that sum and product (exercise 1.31) are both special cases of a still more general notion called accumulate that combines a collection of terms, using some general accumulation function:

(accumulate combiner null-value term a next b)

Accumulate takes as arguments the same term and range specifications as sum and product, together with a combiner procedure (of two arguments) that specifies how the current term is to be combined with the accumulation of the preceding terms and a null-value that specifies what base value to use when the terms run out. Write accumulate and show how sum and product can both be defined as simple calls to accumulate.

This construction doesn't address any of your questions about convergence or integration. It can be further generalized to deal with potentially infinite streams of values to be combined, should you wish to pursue that.
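For readers not following along in Scheme, here is a rough Python transliteration of that accumulate idea (SICP's original is Scheme; the names here are illustrative):

```python
def accumulate(combiner, null_value, term, a, next_, b):
    """Combine term(a), term(next_(a)), ... for indices up to b,
    folding with `combiner` and bottoming out at `null_value`."""
    if a > b:
        return null_value
    return accumulate(combiner, combiner(null_value, term(a)), term, next_(a), next_, b)

def sum_(term, a, next_, b):
    # Summation is accumulation with + and identity 0
    return accumulate(lambda x, y: x + y, 0, term, a, next_, b)

def product(term, a, next_, b):
    # Product is accumulation with * and identity 1
    return accumulate(lambda x, y: x * y, 1, term, a, next_, b)

inc = lambda n: n + 1
print(sum_(lambda n: n, 1, inc, 10))   # 55
print(product(lambda n: n, 1, inc, 5)) # 120
```

The `combiner`/`null_value` pair is exactly a monoid, which is the algebraic structure the comments above point to as what sums, products, and their relatives have in common.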

Ethan Bolker

These are all just variations on looping, which appears in protean guises in programming.

ncmathsadist