From the README:
First raise memory_limit in your php.ini to approximately 10240M (10 GB), and also max_execution_time to 300 seconds.
This is needed to train on a corpus of about 10 MB of data (with questionable copyright status). I haven't checked, but similar requirements likely apply when running predictions on the trained model. That makes it completely unsuited for a typical DokuWiki setup.
dodotori: nevyork/newyoork -> will direct the user to a page called "newyork"
That's not what a Markov chain does. It predicts the next word, similar to what autocomplete does on your phone.
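To illustrate the distinction, here is a minimal sketch of first-order Markov chain word prediction (in Python for brevity; the plugin itself is PHP). Note that it can only ever suggest words it has already seen *following* the current word; it has no notion of spelling similarity:

```python
from collections import defaultdict, Counter

def train(corpus):
    """Build first-order Markov transitions: word -> Counter of next words."""
    words = corpus.split()
    model = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        model[cur][nxt] += 1
    return model

def predict(model, word):
    """Return the most likely next word, or None if the word was never seen."""
    nexts = model.get(word)
    if not nexts:
        return None
    return nexts.most_common(1)[0][0]

model = train("the cat sat on the mat the cat ran")
print(predict(model, "the"))      # 'cat' (follows 'the' twice, 'mat' once)
print(predict(model, "nevyork"))  # None -- a misspelling is just an unknown word
```

A misspelled page name like "nevyork" is simply an unseen token to the model, so it cannot redirect it anywhere.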
What you want is fuzzy search; something like Levenshtein distance would be better suited.