Automatic recognition and classification of tasks in robotic surgery is an important stepping stone toward automated surgery and surgical training. Recent technical breakthroughs in data collection have made data-driven model development possible. In this paper, we propose a framework for high-level robotic surgery task recognition using motion data. We present a novel classification technique that distinguishes three important surgical tasks through quantitative analysis of motion: knot tying, needle passing, and suturing. The proposed technique integrates state-of-the-art data mining and time series analysis methods. The first step of the framework is to construct a time series similarity measure based on derivative dynamic time warping (DDTW). A distance-weighted k-nearest neighbor algorithm then classifies task instances. The framework was validated on an extensive dataset, and our results demonstrate its strength in recognizing fundamental robotic surgery tasks.
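The pipeline named above, a DDTW similarity measure feeding a distance-weighted k-nearest neighbor classifier, can be sketched compactly. The following Python sketch is illustrative only, not the paper's implementation: it assumes univariate, equally sampled series (the study's motion data are multivariate kinematic signals), uses the standard Keogh-Pazzani derivative estimate for DDTW, and all function names and toy task labels are hypothetical.

```python
import numpy as np

def derivative(x):
    """Keogh-Pazzani derivative estimate: average of the step from the
    previous point and the slope across the two surrounding points."""
    x = np.asarray(x, dtype=float)
    return (x[1:-1] - x[:-2] + (x[2:] - x[:-2]) / 2.0) / 2.0

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def ddtw_distance(x, y):
    """DDTW: DTW computed on the derivative-transformed series, so that
    alignment follows local shape (trends) rather than raw magnitudes."""
    return dtw_distance(derivative(x), derivative(y))

def classify(query, train_series, train_labels, k=3, eps=1e-8):
    """Distance-weighted k-NN: each of the k nearest training instances
    votes with weight 1/distance; the class with the largest total wins."""
    dists = np.array([ddtw_distance(query, s) for s in train_series])
    votes = {}
    for idx in np.argsort(dists)[:k]:
        w = 1.0 / (dists[idx] + eps)  # inverse-distance weight
        votes[train_labels[idx]] = votes.get(train_labels[idx], 0.0) + w
    return max(votes, key=votes.get)

# Toy usage with synthetic motion profiles (labels are placeholders).
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 80)
train = [np.sin(t) + 0.05 * rng.standard_normal(80) for _ in range(3)] + \
        [np.cos(t) + 0.05 * rng.standard_normal(80) for _ in range(3)]
labels = ["knot_tying"] * 3 + ["suturing"] * 3
print(classify(np.sin(t + 0.1), train, labels, k=3))  # -> "knot_tying"
```

Inverse-distance weighting makes closer neighbors count more than distant ones, which matters when warped distances within the k nearest vary widely; a full implementation would also handle multivariate series and a warping-window constraint.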