It has been said that current TV series have replaced films, and even literature, as the most successful modern vehicle for telling good stories.
There are many arguments backing such an assertion: the superb acting, the finely crafted plots, the deep connections established between characters and viewers. In fact, those of us who eagerly await the next release of our favorite TV series are a growing legion.
But, in addition to all this, there is something more pervasive: these brilliant series' ability to uncover aspects of our daily lives, and of the realities we face, that are broadly recognizable. There is no better example than Aaron Sorkin's anticipation, in The West Wing, of some of the key dynamics behind the 2008 US presidential election.
Other examples? There are The Wire's profound insights into the debacle currently faced by the newspaper industry as we knew it, or into the intricate factors and perverse incentives behind the decadence and partial failures of the US school system.
PS: back in 2008 (or 2009) my personal ranking of US TV series included Desperate Housewives, Six Feet Under, 24, House and the already-mentioned The Sopranos. Four years on, narrowing that ranking down is much tougher, but it definitely includes Mad Men, The Wire, The West Wing, Breaking Bad and The Good Wife, with honorable mentions to many others, like Curb Your Enthusiasm or Entourage, for their originality, boldness and proven ability to make me laugh... :)