I was thinking that doctors destroy disease, fix broken bones, and take out or rearrange damaged body parts, but then I wondered: once all the nurses are gone, who will be there to nurture and create healing? The way I figure it, science and business still cover only half of healing. It's the human connection, the art of healing, that really heals. How did we let science and business take over healing? I have to think about it!