Is it safe to say that sex and violence sell not only in movies and on television but in books as well? Granted, I am probably the most sexual and caveman-like individual on this planet, but it disturbs me when writers think a literary work is all sex and violence. Is that how we see society, that the only way we can grab readers' attention is with sex and violence, with no plot or depth to the story at all? And for African American authors, it seems that if we are not writing about sex, violence, or God, we are handicapped, and I think that's the farthest thing from the truth. It seems everyone is trying to be Tyler Perry instead of being their own individual, to give diversity not just to black readers but to all readers. If you are just trying to sell books, let it be known, but I think as authors we are much more than that; we have to be more than that to gain credibility in our field of choice.