Like him or not, many believe that Tiger put golf back on the map, whether by drawing bigger crowds or boosting tournament prize money. His dominance was so prominent over the years that many came to see the two as going hand in hand: professional golf and Tiger Woods.
Do you think golf would be where it is today without the Tiger Woods era?