One of my favorite shows, The Walking Dead, has 2 more episodes left. What do you think's gonna happen? Do you think there's going to be a 3rd season? Do you think they're gonna find a "cure" of some sort, or it's all a dream or something stupid?
I personally feel like they can't do much with the show anymore. Both seasons were just them staying somewhere, going out of town, and coming back. Some of my friends think there's a "Safe Place" like in I Am Legend, but I have no idea. Hopefully they do something amazing, 'cause the mid-season ending was pretty good (I won't spoil it for the people who haven't seen it yet).
What are your thoughts on all this?