Okay, so this is admittedly a little bit of a sidetrack, but can someone please explain to me why the majority of Christians today seem so against environmental progress? I mean, I see how your faith impacts your politics when it comes to things like abortion. I totally get that. But seriously? When did conservative Christians decide to leave the environmental debate for everyone else to fight over? Doesn’t God, in the very first book of the Bible, put humans in charge of taking care of the earth? Hello?

I’ve only been a Christian for a few weeks now, and I know there’s still a ton I need to learn. I also know that my specific political leanings may not line up a hundred percent with those of the majority of evangelical Christians, and that’s fine with me. I figure that God’s judging me based