I think most of us have an idea of what relationships should be like. I'm a woman, and a long time ago I used to think that romance, flowers, candles, and all that somewhat superficial stuff were important. Now I realize that understanding, respect, and honesty matter far more. My boyfriend isn't the most romantic guy, but he's the best boyfriend I've ever had because he genuinely shows me that he loves me and is there for me. I don't think romance should be forgotten altogether, but I do believe it isn't what really makes a relationship. What do you think?