Recently introduced Transformational Semantics (TS) formalizes, restrains and makes rigorous the transformational approach epitomized by QR and Transformational Grammars: deriving a meaning (in the form of a logical formula or a logical form) by a series of transformations from a suitably abstract (tecto-) form of a sentence. TS generalizes various 'monad' or 'continuation-based' computational approaches, abstracting away irrelevant details (such as monads) while overcoming their rigidity and brittleness. Unlike QR, the transformations of TS are rigorously and precisely defined, typed, and deterministic. The restraints of TS and the sparsity of choice points (in the order in which the deterministic transformation steps are applied) make it easier to derive negative predictions and control over-generation. We apply TS to right-node raising (RNR), gapping and other instances of non-constituent coordination. Our analyses straightforwardly represent the intuition that coordinated phrases must in some sense be 'parallel', with matching structure. Coordinated material is not necessarily a constituent, even 'below the surface', and we do not pretend it is. We answer the Kubota, Levine and Moot challenge (the KLM problem) of analyzing RNR and gapping without directional types, yet avoiding massive overgeneration. We thus formalize the old idea of 'coordination reduction' and show how to make it work for generalized quantifiers.
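To convey the flavor of deriving a logical formula from an abstract (tecto-) form by a typed, deterministic transformation, the following is a minimal Haskell sketch. It is not the TS calculus itself: the types Tecto and Formula and the transformation lowerQ are names invented here purely for illustration, and the single rewrite step shown stands in for the ordered sequence of transformations that TS actually defines.

-- Toy illustration (not TS proper): one typed, deterministic
-- transformation from a tecto-form to a first-order formula.
-- All names (Tecto, Formula, lowerQ) are invented for this sketch.

-- Abstract (tecto-) forms: a predicate applied to a quantified NP.
data Tecto
  = App Tecto Tecto          -- application of a predicate to its argument
  | Every String             -- "every N": a generalized quantifier
  | Pred String              -- a one-place predicate, e.g. "left"
  deriving Show

-- Target logical forms: first-order formulas.
data Formula
  = Forall String Formula
  | Implies Formula Formula
  | Atom String String       -- predicate name applied to a variable
  deriving Show

-- One deterministic transformation step: when the argument is a
-- universally quantified NP, rewrite the application into a
-- universally quantified implication.
lowerQ :: Tecto -> Maybe Formula
lowerQ (App (Pred p) (Every n)) =
  Just (Forall "x" (Implies (Atom n "x") (Atom p "x")))
lowerQ _ = Nothing           -- the transformation does not apply here

-- Example: "every student left"
-- lowerQ (App (Pred "left") (Every "student"))
--   ==> Just (Forall "x" (Implies (Atom "student" "x") (Atom "left" "x")))
main :: IO ()
main = print (lowerQ (App (Pred "left") (Every "student")))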