It's not so much that caching requires more resources as that it trades one resource (space) for another, such as processor time or bandwidth. By storing and reusing the results of (expensive) computations, additional space is used but time is saved; PHP accelerators are an example of this, as is memoization. As the gaps between processor and memory performance and between processor and I/O performance widen (processors have been outstripping the other two), recomputing a result becomes relatively cheaper than fetching a stored copy, so a computation has to be correspondingly more expensive for caching it to be worthwhile.

Due to PHP's architecture (at the end of a script's execution, all data is discarded), cached data can't be kept in memory between requests, so some form of persistent storage must be used instead. Disk access is extremely slow relative to processor speed, so caching to disk usually isn't worth it, but delegating storage to a database (which can be configured to keep the data in memory, e.g. MySQL's MEMORY engine) may be viable.
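To make the memoization case concrete, here's a minimal sketch (the function and its workload are made up for illustration); note that, per the above, the static cache only lives for the duration of a single request:

    <?php
    // Memoization sketch: cache the results of an expensive pure function in a
    // static array. The cache persists across calls within one request only,
    // since PHP discards all script data when execution ends.
    function expensiveComputation($n)
    {
        static $cache = array();

        if (isset($cache[$n])) {
            return $cache[$n];          // hit: reuse the stored result
        }

        // Stand-in for real work; imagine something genuinely expensive here.
        $result = 0;
        for ($i = 1; $i <= $n; $i++) {
            $result += $i * $i;
        }

        return $cache[$n] = $result;    // miss: compute, store, return
    }

    // The first call computes; the second call with the same argument hits the cache.
    echo expensiveComputation(100000), "\n";
    echo expensiveComputation(100000), "\n";

Within a single request this already helps if the same value is needed repeatedly; to reuse results across requests you need the persistent storage discussed above.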
As for network caching, the factors affecting its utility are network latency (~ 10-100 ms) and goodput (~ 128 kB/s-1 GB/s for high-speed connections, though few will run at the upper end of that range), disk access time (~ 6-26 ms) and throughput (~ 50-400 MB/s), available storage space (currently unmetered, so I'm not certain what the limits are), and the hit rate. From the first four figures, we can expect that fetching data from disk is (very) roughly 10-20 times as fast as fetching it over the network, with latencies being comparable; that ratio puts an upper limit on how much caching can improve performance. The ranges given reflect the variance between hosts; on a given host, the figures will be more consistent.

Hit rate will vary wildly, depending on storage space and the exact order in which items are requested, and it scales the actual benefit: by Amdahl's-law reasoning, the overall speedup is 1/((1-h) + h/s), so with a hit rate h of (e.g.) 50% and disk s = 10-20 times as fast as the network, the average improvement is only about 1.8-1.9x, since misses still pay the full network cost. If you're fetching small amounts of data per request (on the order of 10^5 bytes, i.e. ~100 kB), service time won't be much improved by caching, since latency dominates and the latencies are comparable; if the server spends more than a few seconds per request fetching data, caching should give a noticeable improvement in performance.
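As a rough sketch of the DB-backed approach applied to network fetches (the table name, schema, and TTL below are assumptions, not a prescription), something along these lines would do; with MySQL you could declare the table ENGINE=MEMORY to keep it in RAM:

    <?php
    // Hypothetical table:
    //   CREATE TABLE url_cache (
    //       url        VARCHAR(255) PRIMARY KEY,
    //       body       MEDIUMTEXT,
    //       fetched_at INT
    //   );

    function cachedFetch(PDO $db, $url, $ttl = 300)
    {
        // Look for a sufficiently fresh copy first.
        $stmt = $db->prepare('SELECT body, fetched_at FROM url_cache WHERE url = ?');
        $stmt->execute(array($url));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);

        if ($row && (time() - (int) $row['fetched_at']) < $ttl) {
            return $row['body'];                          // cache hit
        }

        // Miss (or stale entry): pay the network cost, then store the result.
        $body = file_get_contents($url);
        if ($body === false) {
            throw new RuntimeException("Failed to fetch $url");
        }

        $stmt = $db->prepare(
            'REPLACE INTO url_cache (url, body, fetched_at) VALUES (?, ?, ?)'
        );
        $stmt->execute(array($url, $body, time()));

        return $body;
    }

Whether something like this actually wins depends on the hit rate and per-request fetch cost discussed above.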
Network caching is a subset of caching data from I/O-bound processes. Caching search results is another example, which can be advantageous when a large amount of data must be examined for each search.
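For search results, the same pattern applies; a per-request sketch keyed on the normalized query might look like this (the articles table and LIKE-based search are stand-ins for whatever the real search does):

    <?php
    // Cache search results for the current request, keyed on the normalized
    // query so that trivially different queries share an entry.
    function cachedSearch(PDO $db, $query)
    {
        static $results = array();

        $key = md5(strtolower(trim(preg_replace('/\s+/', ' ', $query))));

        if (!isset($results[$key])) {
            // Stand-in for the real (expensive) search over a large data set.
            $stmt = $db->prepare('SELECT id, title FROM articles WHERE body LIKE ?');
            $stmt->execute(array('%' . $query . '%'));
            $results[$key] = $stmt->fetchAll(PDO::FETCH_ASSOC);
        }

        return $results[$key];
    }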
Your best bet is to design your site without caching, but make it easy to add it later on, if necessary. It's best to optimize later in the development process, once you've got a solid design and a working system.