Final Project for the courses Effective Programming Practices and Scientific Computing | Winter 20/21, M.Sc. Economics, Bonn University | Arbi Kodraj

Function Approximation via Machine Learning Methods

Code Style: black

The notebook FunctionApproximation.ipynb contains my work for the final project for Effective Programming Practices and Scientific Computing. It compares different methods for approximating functions of varying complexity, with the core methods originating from Machine Learning. The project aims to convince the audience that modern Machine Learning methods approximate mathematical functions at least as well as conventional methods, and for some functions even better, and that they are therefore worth considering for approximation-related problems.

At this point, I would like to mention that I use both object-oriented and functional programming. It was only in the course of this project that I taught myself object-oriented programming, which I then improved with the help of the book "Fluent Python" by Ramalho (2015). I also apply what I learned in the Effective Programming Practices and Scientific Computing courses, which is why the quality of my code differs noticeably across the notebook. To make this learning curve visible, I deliberately kept the earlier, weaker implementations; this is meant only as an explanation for the different coding styles.

The best way to access this notebook is to clone or download the repository and open the notebook locally via Jupyter Notebook. Alternatively, it can be viewed here: Binder, nbviewer (although the latter is not recommended because of poor rendering).

For my code's documentation, I used Google-style docstrings and built an HTML version via Sphinx, which requires the napoleon extension. Since the repository was private, I could use neither GitHub Pages nor Read the Docs to host the HTML files; the documentation therefore has to be opened locally. For this and for replication purposes, I recommend cloning the repository and creating its environment as follows:

$ git clone https://github.com/ArbiKodraj/ML-Approximation.git
$ cd ML-Approximation
$ conda env create --file environment.yml

Make sure that the environment Project_ArbiKodraj is now listed:

$ conda env list

Then, the environment can be activated and deactivated using:

$ conda activate Project_ArbiKodraj
$ conda deactivate 
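
With the environment activated, the notebook can then be launched locally, for example (a minimal command, assuming Jupyter is installed in the environment, which environment.yml should provide):

$ jupyter notebook FunctionApproximation.ipynb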

Once the repository has been cloned locally, the documentation's entry point can be found in the following directory:

./docs/build/html/index.html
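
Should the HTML files be missing or outdated, the documentation can presumably be rebuilt from the docs folder with the standard Sphinx workflow (a sketch, assuming the usual Sphinx Makefile layout that the build directory suggests):

$ cd docs
$ make html

The resulting index.html can then be opened in any browser.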

Furthermore, I have prepared tests for the last section using Python's built-in unittest library. I chose unittest because it lends itself well to testing object-oriented code. The tests can be found in the test folder. For executing the tests, I recommend using PyCharm or Visual Studio Code.
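
Alternatively, the tests should be runnable from the command line via unittest's test discovery (a sketch, assuming the modules in the test folder follow the default test_*.py naming convention):

$ python -m unittest discover -s test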

Course Instructor Effective Programming Practices : Hans-Martin Gaudecker

Course Instructor Scientific Computing : Philipp Eisenhauer

Reproducibility

In order to ensure full reproducibility, I have set up a continuous integration environment using Travis CI, which can be checked here: Build Status

Notebook's Structure

  • 1. Introduction: Introduces the paper's objective and structure
  • 2. Motivation of Interpolation: Briefly motivates the use of approximation/interpolation
  • 3. Application of Interpolation: Demonstrates two approximation strategies as conventional interpolation tools
  • 4. Neural Networks as Modern Machine Learning Method: Illustrates the use of Neural Networks as an alternative tool for approximating functions
  • 5. Further Machine Learning Methods: Presents additional Machine Learning methods and their usefulness in approximating functions
  • 6. Economic Application: Uses the discussed approximation tools to solve economic problems related to function approximation
  • 7. Conclusion: Summarizes the key insights and contrasts the illustrated approximation tools


References

  • Athey, S., 2018. The impact of machine learning on economics. In The economics of artificial intelligence: An agenda (pp. 507-547). University of Chicago Press.

  • Brynjolfsson, E., Mitchell, T. and Rock, D., 2018. What can machines learn, and what does it mean for occupations and the economy? In AEA Papers and Proceedings (Vol. 108, pp. 43-47).

  • Chaboud, A.P., Chiquoine, B., Hjalmarsson, E. and Vega, C., 2014. Rise of the machines: Algorithmic trading in the foreign exchange market. The Journal of Finance, 69(5), pp.2045-2084.

  • Cybenko, G., 1989. Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals and Systems, 2(4), pp.303-314.

  • Gasca, M. and Sauer, T., 2000. Polynomial interpolation in several variables. Advances in Computational Mathematics, 12(4), pp.377-410.

  • Gasca, M. and Sauer, T., 2001. On the history of multivariate polynomial interpolation. In Numerical Analysis: Historical Developments in the 20th Century (pp. 135-147). Elsevier.

  • Goodfellow, I., Bengio, Y. and Courville, A., 2016. Deep Learning. Cambridge, MA: MIT Press.

  • Heaton, J.B., Polson, N.G. and Witte, J.H., 2016. Deep learning in finance. arXiv preprint arXiv:1602.06561.

  • Hendershott, T., Jones, C.M. and Menkveld, A.J., 2011. Does algorithmic trading improve liquidity? The Journal of Finance, 66(1), pp.1-33.

  • Hopfield, J.J., 1982. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the national academy of sciences, 79(8), pp.2554-2558.

  • Ketkar, N. and Santana, E., 2017. Deep learning with python (Vol. 1). Berkeley, CA: Apress.

  • Lin, H.W., Tegmark, M. and Rolnick, D., 2017. Why does deep and cheap learning work so well? Journal of Statistical Physics, 168(6), pp.1223-1247.

  • Miranda, M.J. and Fackler, P.L., 2004. Applied computational economics and finance. MIT press.

  • Moocarme, M., Abdolahnejad, M. and Bhagwat, R., 2020. The Deep Learning with Keras Workshop: An Interactive Approach to Understanding Deep Learning with Keras, 2nd Edition.

  • Murphy, K.P., 2012. Machine learning: a probabilistic perspective. MIT press.

  • Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V. and Vanderplas, J., 2011. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12, pp.2825-2830.

  • Ramalho, L., 2015. Fluent Python: Clear, Concise, and Effective Programming. O'Reilly Media.

  • Tegmark, M., 2017. Life 3.0: Being human in the age of artificial intelligence. Knopf.

  • Virtanen, P., Gommers, R., Oliphant, T.E., Haberland, M., Reddy, T., Cournapeau, D., Burovski, E., Peterson, P., Weckesser, W., Bright, J. and van der Walt, S.J., 2020. SciPy 1.0: fundamental algorithms for scientific computing in Python. Nature methods, 17(3), pp.261-272.

License: MIT