I don't use SuperCollider, but I assume it can shell out to a command-line program, in which case the CDP programs can be used pretty much as is. But none of the programs is itself real-time or interactive; it would be a major project for someone to factor out the essential DSP code into a form that could be driven by a real-time engine. One of the many reasons for going open-source is to encourage just that sort of enterprise.
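The shell-out approach can be sketched as a thin wrapper around process invocation. This is a minimal illustration, not CDP-specific code: the wrapper below would be called with the name of an actual CDP binary and its arguments; here a stand-in command (`echo`) is used purely to demonstrate the pattern.

```python
import subprocess

def run_cdp(args):
    """Invoke a command-line program (e.g. a CDP process) and return
    its captured standard output. The program name and flags are
    supplied by the caller; nothing here is CDP-specific."""
    result = subprocess.run(args, capture_output=True, text=True)
    result.check_returncode()  # raise if the program reported an error
    return result.stdout

# Using `echo` as a stand-in for an actual CDP binary:
print(run_cdp(["echo", "processed"]))
```

In practice the arguments would name input and output soundfiles, and the host (SuperCollider or anything else) would load the result once the process exits.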
All of the filter processes can in principle run in real time (always dependent on CPU power, of course), and many of the spectral processes can too (as we showed commercially in "Spectral Transformer" for Cakewalk, and more recently in the streaming pvoc framework for Csound). Several of the distortion processes could in principle run in real time (possibly with some latency) where they do not either add or subtract wavecycles, but they would need a lot of re-engineering to handle wavecycle measurement, latency, and dynamic user interaction.
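Why filters stream so naturally can be shown with a toy example. The one-pole lowpass below is an illustrative sketch, not CDP code: because its only state is the previous output sample, it can process audio block by block and produce exactly the same result as processing the whole file at once.

```python
class OnePoleLowpass:
    """Streaming one-pole lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1]).
    All state needed between blocks is the single value self.y, so the
    filter can run in real time on successive buffers of any size."""

    def __init__(self, a):
        self.a = a      # smoothing coefficient, 0 < a <= 1
        self.y = 0.0    # previous output sample (the entire state)

    def process_block(self, block):
        out = []
        for x in block:
            self.y += self.a * (x - self.y)
            out.append(self.y)
        return out

# Processing in two blocks gives the same samples as one pass:
whole = OnePoleLowpass(0.5).process_block([1.0, 1.0, 1.0, 1.0])
f = OnePoleLowpass(0.5)
parts = f.process_block([1.0, 1.0]) + f.process_block([1.0, 1.0])
```

This block-at-a-time property is exactly what a real-time engine requires, and it is what the multi-pass processes described below lack.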
Many of the programs do multiple passes over either the input or the output, for such things as maximum-amplitude computation and normalisation; these will convert to a real-time streaming environment only with great difficulty, if at all. This includes, for example, the granular synthesis/brassage programs.
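The obstacle can be made concrete with a sketch of peak normalisation (an illustrative example, not CDP source): the gain to apply depends on the peak of the *whole* signal, so the first output sample cannot be produced until the last input sample has been seen, which rules out streaming.

```python
def normalise(samples, target=1.0):
    """Peak-normalise a signal to the target level.
    Requires two passes: the gain cannot be known until pass 1
    has seen the final sample, so this cannot run as a stream."""
    # Pass 1: scan the entire signal for its peak amplitude.
    peak = max((abs(s) for s in samples), default=0.0)
    if peak == 0.0:
        return list(samples)
    # Pass 2: rescale every sample with the gain found in pass 1.
    gain = target / peak
    return [s * gain for s in samples]
```

A streaming version could only approximate this, e.g. with a running peak estimate and a latency window, which changes the result.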