In this paper, we propose TopoNAS, a model-agnostic approach for gradient-based one-shot NAS that significantly reduces search time and memory usage through topological simplification of searchable paths. First, we model the non-linearity in search spaces to reveal the difficulties of parameterization. We then present a topological simplification method and iteratively apply module-sharing strategies to simplify the topological structure of searchable paths. In addition, we propose a kernel normalization technique to preserve search accuracy. Experimental results on the NASBench201 benchmark with various search spaces demonstrate the effectiveness of our method and show that TopoNAS improves the search efficiency of various architectures while maintaining a high level of accuracy.