For desktop computing the effects are negligible. In industrial-scale file and web servers, however, they become far more relevant: the faster you can read files and databases, the better. Google, for example, must maintain enormous databases copied hundreds of times across multiple countries. A search engine is effective precisely because it can read copious amounts of data and return relevant results in a very short space of time. Databases that large simply would not work at standard transfer rates, so they sit behind fast connections and fast drives, duplicated many times over.
It is through this mix of raw transfer speed and efficient code that search engines can return results in well under a second from a database with billions of entries. Suddenly, those transfer rates seem a little more relevant.
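To get a feel for why both halves matter, here is a rough back-of-envelope sketch. All of the sizes and speeds below are assumptions picked purely for illustration, not figures from any real search engine: even a very fast drive array cannot brute-force a scan of a multi-terabyte dataset in under a second, so sub-second results depend on reading only a tiny indexed slice of it.

```python
# Back-of-envelope sketch: why transfer rate matters at scale.
# Every figure here is an illustrative assumption, not a measurement.

DATASET_SIZE_BYTES = 10 * 1024**4      # assume a 10 TiB slice of the index
DESKTOP_DRIVE_BPS  = 150 * 1024**2     # ~150 MiB/s, a typical consumer hard disk
SERVER_ARRAY_BPS   = 20 * 1024**3      # ~20 GiB/s, a hypothetical striped server array

def full_scan_seconds(size_bytes: float, throughput_bps: float) -> float:
    """Time to read the whole dataset sequentially at the given throughput."""
    return size_bytes / throughput_bps

print(f"Desktop drive, full scan: {full_scan_seconds(DATASET_SIZE_BYTES, DESKTOP_DRIVE_BPS) / 3600:.1f} hours")
print(f"Server array,  full scan: {full_scan_seconds(DATASET_SIZE_BYTES, SERVER_ARRAY_BPS) / 60:.1f} minutes")

# Even the fast array cannot scan everything in under a second, which is where
# the "efficient code" half comes in: with an index, a query only touches a
# handful of blocks instead of the whole dataset.
BLOCKS_PER_QUERY = 50                  # assumed index blocks read per search
BLOCK_SIZE_BYTES = 64 * 1024           # assumed 64 KiB block size
bytes_per_query = BLOCKS_PER_QUERY * BLOCK_SIZE_BYTES
print(f"Indexed query, transfer time: {bytes_per_query / SERVER_ARRAY_BPS * 1000:.3f} ms")
```

Under these assumed numbers a sequential scan takes hours on a desktop drive and still minutes on the fast array, while an indexed query spends a fraction of a millisecond actually moving data, which is how fast storage and efficient code combine to stay well under a second.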