Talk:Hermite interpolation


Method: simple case

The section abruptly ends without telling me how to get the polynomial from the divided differences table. Could someone who understands it add it, please? :)

Untitled

I think the error section could use an example, but I'm not confident that my example is correct. Here's my understanding of it:

Using the example above, this would be f^(9)(c)/9! · (x+1)^3 · x^3 · (x-1)^3, where x is the number at which the function is approximated and c is between -1 and 1 (the lowest and highest x-values used to produce the approximation). The maximum of this expression over all allowed c is the maximum error of the approximation at that x value.

Can somebody verify/correct this?

I believe that it isn't supposed to be f to the ninth power but rather the 9th derivative of f, in which case the error is zero. As you may notice, the Hermite interpolation of that function came out to exactly the same polynomial as the original function. You aren't supposed to use this interpolation technique when evaluating a polynomial, because you would be using polynomials to approximate a polynomial, so it is just more work than evaluating the original directly. This function was used to demonstrate that you can get very high accuracy from only 3 points because of the extra information the derivatives provide. 159.242.239.116 (talk) 14:33, 5 November 2008 (UTC)
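A short justification of the point above, as a sketch (assuming, as the comment suggests, that the example matches the value and first two derivatives of a polynomial of degree at most 8 at three nodes, i.e. nine conditions): the error term has the form

\[
f(x) - p(x) = \frac{f^{(9)}(c)}{9!}\,(x - x_1)^3 (x - x_2)^3 (x - x_3)^3 ,
\]

and the ninth derivative of a polynomial of degree at most 8 is identically zero, so the error vanishes and the interpolant reproduces f exactly.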

Wiki Education Foundation-supported course assignment

This article was the subject of a Wiki Education Foundation-supported course assignment, between 26 May 2020 and 3 July 2020. Further details are available on the course page. Student editor(s): Yifeng Li.

Above undated message substituted from Template:Dashboard.wikiedu.org assignment by PrimeBOT (talk) 23:23, 16 January 2022 (UTC)

Article Dependency

This article can't be understood without reading the Newton polynomial article. —Preceding unsigned comment added by Arbitrary18 (talk · contribs) 04:17, 29 September 2008 (UTC)

I've expanded the introduction and usage sections to hopefully clear things up and make it explicit where Newton polynomials are required. I also redid the example table showing how the first few values are obtained, so by studying that you can get a working knowledge of how the algorithm works. - from Andrew Poelstra 22:21, 21 October 2009 (UTC)

Todo at this point would be clarifying how general the interpolation is - my additions work mainly with the first derivative and don't go into detail about how to work with higher ones. Also, building Hermite polynomials from Lagrange polynomials needs to be explained. Those two things should eliminate the article dependency. - from Andrew Poelstra 22:21, 21 October 2009 (UTC) —Preceding unsigned comment added by Apoelstra (talk · contribs)
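For reference, a sketch of the standard Lagrange-based construction for the first-derivative case (here \ell_i denotes the Lagrange basis polynomial for node x_i; higher derivatives need the more general confluent formulas):

\[
p(x) = \sum_i \bigl[ f(x_i)\, h_i(x) + f'(x_i)\, \hat{h}_i(x) \bigr],
\qquad
h_i(x) = \bigl(1 - 2\,\ell_i'(x_i)\,(x - x_i)\bigr)\,\ell_i(x)^2,
\qquad
\hat{h}_i(x) = (x - x_i)\,\ell_i(x)^2 .
\]

Since \ell_i(x_j) is 1 when i = j and 0 otherwise, one can check that h_i and \hat{h}_i reproduce the prescribed values and first derivatives at the nodes.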

Cleaned up the document so that interpolating higher derivatives is clear. Also fixed the error formula to make sense (the old one was impossible to read) and to match the one given in Faires. I removed the "may be confusing" warning because the article is understandable now. Todo: write another example (of a non-polynomial) and prove the error bound. Also, if anyone knows a better textbook, that would be awesome. Faires is awful. — Preceding unsigned comment added by Apoelstra (talk · contribs) 21:28, 30 December 2010 (UTC)

Ruby implementation doesn't work

I c&p'ed the algorithm and tried it with

x = [ -1, 0, 0, 0, 1]
y = [ 0, 1, 0, -4/2, 0]

Output is

0 1 -1 -1 2

I used these coefficients to compute the interpolation at x=1. The result is 2, which is obviously wrong (the last coefficient should be 1).

--Pberndt (talk) 16:13, 28 May 2009 (UTC)
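For anyone re-checking this report: reading the output as Newton-form coefficients over the nodes -1, 0, 0, 0 (a sketch, assuming the repeated y-entries are f(0), f'(0), f''(0)/2!, which is what the -4/2 entry suggests), the reported coefficients give

\[
p(1) = 0 + 1\,(1+1) - 1\,(1+1)\cdot 1 - 1\,(1+1)\cdot 1^2 + 2\,(1+1)\cdot 1^3 = 2 ,
\]

whereas the divided-difference table for this data works out to 0, 1, -1, -1, 1; its Newton form simplifies to p(x) = (x^2 - 1)^2, which correctly gives p(1) = 0.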

Here's a working version. Unfortunately I don't understand Hungarian, and I don't understand the difference between the scripts (yet).
http://www.mathworks.com/matlabcentral/fileexchange/14353 --91.5.211.65 (talk) 11:07, 31 May 2009 (UTC)
I created a working example for the German Wikipedia (it might be moved to de:Neville-Aitken-Schema). Translation help:
  • anfangsIndex means start index (If , to calculate you have to calculate instead)
  • funktionsWert is the value of f (or f′, f″, …)
  • z contains basic function values, so if , holds.
--Pberndt (talk) 19:59, 31 May 2009 (UTC)
See [1]. -- Pberndt (talk) 14:02, 12 July 2009 (UTC)

Ouch. Without going (further) into original research, can you try this?

def min(a,b)
  if a<b then a else b end
end

def hermite(x, y)
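  # Build the Newton divided-difference coefficients for Hermite data.
  # Repeated entries in x mark a repeated node; the matching entries in y are
  # taken as f, f'/1!, f''/2!, ... at that node (cf. the -4/2 in the data above).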
  base = Array.new(y.length)
  q = Array.new(y.length)
  base[0] = 0
  q[0] = y[0]
  1.upto(y.length-1) do |j|
    base[j] = (x[j] == x[j-1]) ? base[j-1] : j   # point to corresponding y value
    q[j] = y[base[j]]                                         # fill with y value
  end

  1.upto(y.length-1) do |i|
    qq = q[i-1]           # in the loop below qq = q[j-1] from previous iteration
    i.upto(y.length-1) do |j|
      if x[j-i] == x[j] then
        qq, q[j] = q[j], y[min(j,base[j]+i)]   # do not go beyond i-th derivative
      else
        qq, q[j] = q[j], (q[j]-qq) / (x[j]-x[j-i])
      end
    end
  end
  [x,q]
end

def poly(x, hermite)
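  # Evaluate the Newton-form polynomial at x, given the [nodes, coefficients]
  # pair returned by hermite.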
  xi, q = hermite
  y = q[0]
  c = 1
  1.upto(q.length-1) do |i|
    c *= x - xi[i-1]
    y += q[i]*c
  end
  y
end
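
A minimal usage sketch with the data from the report above (xs and ys are just names for this example; the repeated entries of ys are f(0), f'(0), f''(0)/2!):

xs = [-1, 0, 0, 0, 1]
ys = [ 0, 1, 0, -4/2, 0]   # f(-1), f(0), f'(0), f''(0)/2!, f(1)
tableau = hermite(xs, ys)  # => [nodes, divided-difference coefficients]
p tableau[1]               # prints [0, 1, -1, -1, 1]
p poly(1, tableau)         # prints 0, matching p(x) = (x**2 - 1)**2 at x = 1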

I had placed the program in the article in good faith as a helper to decipher the tableau, in case anyone wanted to improve the article, but maybe it's better left here (and the code above communicates the intention much better anyway). Balabiot (talk) 16:50, 18 August 2009 (UTC)

Works for me. You could place your code in the "Algorithm Implementation" wikibook and put a link into the article. --Pberndt (talk) 13:38, 31 October 2009 (UTC)

Added some references, but probably not in the best way

I've added two references (Hermite 1878 and Traub 1964). I should probably add Salzer 1960 as well, and Schneider & Werner. I've forgotten how to cite references properly, sorry; I'll come back to this and fix it soon if someone else doesn't get to it first.

More importantly, I have noted that the discussion originally present was restricted to the case when all confluencies were identical. Traub says that this is the most important case, so this is not necessarily bad. But the more general case was already solved by Hermite. It's quite remarkable how often his formulae get reinvented.

I could cite my book with Nic Fillion (we have an extensive discussion of this point in Chapter 8), but I don't think it's necessary. Since I have code publicly available for the general case, at http://nfillion.com/coderepository/Graduate_Introduction_to_Numerical_Methods/BHIP.mpl (where it has been for 8 years), maybe this is of interest. I see that people have been trying to code the divided difference method (which can be numerically less stable, sometimes surprisingly so). Rob.Corless (talk) 20:17, 2 June 2024 (UTC)

Edited to add: since the error formula in the original text was incorrect (no "plus one" is needed for the k_i), I corrected that and added the error formula for the complex case. Since my book was handy and had the formula, I added that. I would be perfectly fine if someone replaced that citation with another source. Rob.Corless (talk) 20:35, 2 June 2024 (UTC)

Drat. Now I see that I have mixed the styles of citation. Sigh. I'll fix that now. Rob.Corless (talk) 20:41, 2 June 2024 (UTC)
Ok, made the references uniform. Probably still not optimal.
Since I have cited myself, I expect someone will come along to object; I am happy if another source is found for the error formula, or for the stable method of computing Hermite interpolants.
I could expand this article significantly, given what I have written on this elsewhere, but I don't think I will do that today.
I'm not so happy with the divided difference method, or with the Spitzbart reference (which seems to me inferior to, e.g., Salzer 1960, which is more general, and also inferior to Traub 1964). But I'm not willing to rework the divided difference discussion, which seems to be based on the Spitzbart reference. Rob.Corless (talk) 21:19, 2 June 2024 (UTC)