Got a high school kid who doesn't quite understand the horrors of war yet. They think it's all glorified "USA killing Nazis" stuff and that war is good, not the true terror of it. Any suggestions? A movie or show would work.
Some that I had been considering are "Men Against Fire" from Black Mirror, Apocalypse Now, and maybe Platoon. Not really sure, though, and I was curious if there were some modern ones as well.
source https://www.reddit.com/r/movies/comments/rriyn4/what_is_a_good_movie_or_show_that_shows_the/