What makes a documentary “important”? What makes it worth referencing, or remembering, or even watching in the first place? Why, in this time of seemingly perpetual sociopolitical strife, would we veer away from the vaunted, glorious escapism of big feature films and go see something small and rooted in the real, instead?
Documentaries can be a hard sell, but it’s one that’s getting easier all the time. Once viewed as something stiff and obligatory, documentary film has, in recent years, risen to the top of the heap, thanks in no small part to some of the earth-shaking, needle-pushing, and ultimately world-changing films listed here, which find their focus in war, love, sex, death, and everything in between. As for this list, its only qualifier is simple: these are the critically acclaimed, historically important, and pivotal films that anyone who cares about film (and, by extension, about humanity) should really get to know.