The Walking Dead's ending in the season 11 finale set up the future of the franchise after more than a decade of brutal deaths, undead action, and post-apocalyptic drama. Though the finale's title promised to "Rest in Peace," The Walking Dead's last episode proved neither peaceful nor restful.