All the concerns about the behavior and stance of women in the church that have been described have their roots in something. Isn't it possible that when women are put in any role in the church that God hasn't commanded, the unhappy results are going to be far-reaching? Are the women writing the theological books breaking new ground, or has a minister or recognized male theologian already said these same things? What is it about the theology written by women that makes it particularly desirable? I'm simply asking because I don't know and haven't given this any real consideration.