Matching handwritten notes using computer vision and pattern matching

This is a degree project for a professional qualification at the advanced (second-cycle) level from Uppsala universitet/Avdelningen för visuell information och interaktion.

Abstract: What people take for granted is not as easy for computers. Judging whether two images show the same thing, even though they differ in resolution, viewing angle, or lighting, is easy for humans but much more difficult for computers. Today's mobile phones are more powerful than ever, which has opened the door to more hardware-demanding algorithms. How can handwritten notes be matched effectively so that duplicates can be eliminated in an application? Are some methods and approaches better than others, how do they compare, and can both accuracy and speed be achieved? By analysing images taken at different angles, distances, and lighting conditions, different methods and approaches were developed and evaluated. The results are presented in tables reporting time and accuracy. Eight methods were evaluated. The methods were tuned on a dataset consisting of 150 post-it notes, each imaged under four conditions, giving 600 images and 1800 possible pair-wise matches. They were then evaluated on an independent dataset of 250 post-it notes, each imaged under four conditions, giving 1000 images and 3000 possible pair-wise matches. The best method found 99.7% of the matching pairs, and the worst method found 62.9%. Seven of the eight evaluated methods made no incorrect matches.
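The abstract does not name the eight evaluated methods, but a common family of approaches for this kind of duplicate detection is local-feature matching. The sketch below is a minimal illustration under that assumption, not the thesis' actual pipeline: it uses OpenCV ORB features with brute-force Hamming matching and Lowe's ratio test, and the file names, feature count, and decision threshold are illustrative choices. The reported pair counts are consistent with counting ordered pairs per note: four imaging conditions give 4 × 3 = 12 ordered pairs, so 150 notes yield 1800 pairs and 250 notes yield 3000.

    import cv2 as cv

    def count_good_matches(path_a: str, path_b: str, ratio: float = 0.75) -> int:
        """Count ORB matches between two note photos that survive Lowe's ratio test."""
        img_a = cv.imread(path_a, cv.IMREAD_GRAYSCALE)
        img_b = cv.imread(path_b, cv.IMREAD_GRAYSCALE)
        if img_a is None or img_b is None:
            raise FileNotFoundError("could not read one of the input images")

        # Detect binary ORB descriptors; 1000 features is an arbitrary illustrative choice.
        orb = cv.ORB_create(nfeatures=1000)
        _, desc_a = orb.detectAndCompute(img_a, None)
        _, desc_b = orb.detectAndCompute(img_b, None)
        if desc_a is None or desc_b is None:
            return 0

        # Brute-force Hamming matcher with k=2 nearest neighbours for the ratio test.
        matcher = cv.BFMatcher(cv.NORM_HAMMING)
        knn = matcher.knnMatch(desc_a, desc_b, k=2)

        # Keep a match only if it is clearly better than the second-best candidate.
        good = [pair[0] for pair in knn
                if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
        return len(good)

    if __name__ == "__main__":
        # Hypothetical usage: two photos of the same post-it taken under different
        # conditions are declared duplicates when the match count exceeds a tuned threshold.
        n = count_good_matches("note_frontal.jpg", "note_angled.jpg")
        print("duplicate" if n > 40 else "distinct")

Binary descriptors such as ORB are attractive on mobile hardware because Hamming-distance matching is cheap, which is relevant to the speed-versus-accuracy trade-off raised in the abstract.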
