Weighted Kernel Regression for Predicting Changing Dependencies

Steven Busuttil and Yuri Kalnishkan

(2007)

Steven Busuttil and Yuri Kalnishkan (2007) Weighted Kernel Regression for Predicting Changing Dependencies. Machine Learning: ECML 2007, 4701, pp. 535-542. ISSN 0302-9743

Our Full Text Deposits

Full text access: Open

Full Text - 272.23 KB

Abstract

We want to make predictions in the online mode of learning for data where the dependence of the outcome y on the signal x can change with time. Standard regression techniques give all training examples the same weight; however, it is clear that older examples are less representative of the current dependency. Therefore, we require methods in which the information content of examples decays with time. We propose two methods for doing this: one naive and another based on the Aggregating Algorithm (AA). Surprisingly, these two techniques are computationally similar. To measure the empirical performance of these new methods, we perform experiments on options implied volatility data provided by the Russian Trading System Stock Exchange (RTSSE). In these experiments our methods perform better than the proprietary state-of-the-art technique currently used at the RTSSE.
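The naive decaying-weights idea described above can be sketched as exponentially weighted kernel ridge regression, where example t receives weight decay^(T-1-t) so that older examples count less. This is an illustration under our own assumptions (the RBF kernel, the function names, and all parameter values are ours), not the paper's exact algorithm:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def weighted_krr_predict(X, y, x_new, decay=0.9, ridge=0.1, sigma=1.0):
    """Kernel ridge regression with example weights decay**(T-1-t):
    the newest example has weight 1 and older ones decay geometrically."""
    T = len(y)
    w = decay ** np.arange(T - 1, -1, -1.0)  # oldest first, newest weight 1
    S = np.sqrt(w)
    K = rbf_kernel(X, X, sigma)
    # Weighted dual system: (S K S + ridge * I) c = S y
    A = (S[:, None] * K * S[None, :]) + ridge * np.eye(T)
    c = np.linalg.solve(A, S * y)
    k = rbf_kernel(X, x_new[None, :], sigma).ravel()
    return float((S * c) @ k)
```

With decay close to 1 this reduces to ordinary (unweighted) kernel ridge regression; with a small decay the prediction tracks the most recent examples, which is the behaviour wanted when the dependency of y on x changes over time.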

Information about this Version

This is a Draft version
This version's date is: 08/09/2007
This item is peer reviewed

Link to this Version

https://repository.royalholloway.ac.uk/items/90850a37-133a-4cdf-ba81-fb29f4a193f6/1/

Item Type: Journal Article
Title: Weighted Kernel Regression for Predicting Changing Dependencies
Authors: Busuttil, Steven; Kalnishkan, Yuri
Departments: Faculty of Science\Computer Science; Research Groups and Centres\Computer Science\Computer Learning Research Centre

Identifiers

DOI: 10.1007/978-3-540-74958-5

Deposited by () on 30-Mar-2010 in Royal Holloway Research Online. Last modified on 06-Jan-2011.

Notes

(C) 2007 Springer-Verlag, whose permission to mount this version for private study and research is acknowledged. The repository version is the author's final draft.


