June 3, 2007 3:28am CST
Although society, religion, and politics have shunned women for centuries, many belief systems hold that women are supreme beings on this earth, symbols of divinity and the very essence of humanity. If that's the case, why do women still sit at the bottom of the social hierarchy? Should men continue to dominate the world, perhaps because women are somehow less intellectually capable? What are your thoughts on the matter, ladies and gents?