I went to a panel discussion at the Southeastern Wisconsin Book Festival titled "What Do Women Want?". I was intrigued by the title and wasn't sure what it would be about. A few genres were represented on the panel -- Literary Fiction, Romance, and Women's Fiction. The gist was that publishers place authors in genres such as Women's Fiction to sell their books. A marketing ploy? And why is there no Men's Fiction genre?