Just to add to or clarify what's already been said - when an 802.11 device is confronted with co-channel users (other routers/devices tuned to the same center channel) within its reception range, it will try to negotiate with those other "alien" devices for shared airtime (oversimplifying to get the general point across, I know). Ideally, assuming all of those co-channel users can hear each other and act accordingly, there should be reasonably effective sharing of the channel's bandwidth between them. Of course, in the real world many factors make this less effective than ideal but, to a degree, it works OK.
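To make the airtime-sharing idea concrete, here's a toy Python sketch - my own simplification, not how 802.11's actual CSMA/CA timing or contention windows work - where co-channel devices that can all hear each other draw random backoffs each slot and the lowest unique backoff wins the airtime:

```python
import random

def csma_airtime_share(n_devices, slots=10000, seed=42):
    """Toy listen-before-talk model: in each time slot, every device
    draws a random backoff; the device with the lowest unique backoff
    "transmits" and wins the slot. If two devices pick the same lowest
    backoff, that's a collision and nobody gets credit. Returns the
    number of slots won by each device."""
    rng = random.Random(seed)
    wins = [0] * n_devices
    for _ in range(slots):
        backoffs = [rng.randint(0, 15) for _ in range(n_devices)]
        lowest = min(backoffs)
        if backoffs.count(lowest) == 1:  # unique winner, no collision
            wins[backoffs.index(lowest)] += 1
    return wins
```

Run it with three stations and the slot wins come out roughly equal - that's the "reasonably effective sharing" you get when everyone on the channel can hear and defer to everyone else.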
The problem with off-frequency devices that spill a large chunk of their transmitted modulated power envelope onto adjacent channels is that you don't get that "nice" (ideally) co-interaction between the devices. The bleedover is seen as plain noise by an on-channel device, which cannot negotiate with the offenders and must simply work around them. In other words, the on-channel device has to deal with a raised noise floor on its channel caused by all the overlapping gunk from devices that are close in frequency but not on-channel. That noise is essentially treated the same as any other random, non-negotiable noise from, say, microwave ovens, cordless phones, Bluetooth devices, and non-802.11 wireless cameras.
Stevech may be correct about the IEEE channel designations - I may have heard/read wrong that it was the FCC that created those old 5 MHz channel designations - but either way, they don't match, in terms of usable bandwidth, what 802.11 devices actually use. The devices occupy multiple "channels" when they operate: you may be centered on channel 6, for example, but you're really using roughly a five-channel spread. So moving a nearby device to channel 7 gets you the worst case - the channel 6 and channel 7 devices each hear the other's spectral bleedover only as a raised noise floor, lessening wireless performance for both. It would actually be better to put those two devices on the same center channel so they could negotiate with each other (though not better, of course, than putting them on channels spread significantly far apart). Hence, for nearby devices, it is good practice to separate working center channels by at least 5 channels (25 MHz), which is exactly what the 1, 6, and 11 "standard" in the US gives you. You certainly can legally operate on any of the other channels, and in some cases that may be a better option depending on what your neighbors' wifi devices are doing, but ideally, if everything could be coordinated, the 1-6-11 scheme is what to shoot for.
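The arithmetic behind 1-6-11 is easy to sketch. In the 2.4 GHz band the channel centers sit 5 MHz apart (channel 1 = 2412 MHz), while each transmission is roughly 20 MHz wide (about 22 MHz for the old DSSS modes - I'm using 20 as a round number here). A quick Python check shows why 5 channels of separation keeps devices clear of each other and 1 channel doesn't:

```python
def channel_center_mhz(ch):
    """2.4 GHz band: channel centers are 5 MHz apart, channel 1 = 2412 MHz.
    (Channel 14, where it's legal at all, is a special case and ignored here.)"""
    return 2407 + 5 * ch

def channels_overlap(ch_a, ch_b, width_mhz=20):
    """Two signals of the given width overlap whenever their centers
    are closer together than that width."""
    return abs(channel_center_mhz(ch_a) - channel_center_mhz(ch_b)) < width_mhz

# Channels 1 and 6 are 25 MHz apart -> no overlap.
# Channels 6 and 7 are only 5 MHz apart -> heavy overlap.
```

So 1, 6, and 11 (2412, 2437, 2462 MHz) are the widest-spaced trio of 20 MHz signals that fit in the US portion of the band without stepping on each other.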
So, in choosing a channel, it's a balancing act between keeping the overall noise floor as low as possible and sharing the on-channel spectrum with co-channel users. In most cases, if channels 1, 6, and 11 are all in use nearby, you pick whichever of the three seems least used, or used only by the most distant stations, and operate there. But because many devices are left on an "auto" channel setting (which, I think, in most cases is not based on a well-thought-out algorithm), and because less knowledgeable users don't understand the nature of 802.11 communication, you may be faced with many devices occupying overlapping adjacent channels. In that case it's really a "just do your best" approach: pick the channel with the least bled-over energy and try to stick with 1, 6, or 11 unless it's just not practical. For example, if your band is so saturated with users spread all over the 11 channels, it might work best to choose a channel occupied by a strong nearby user that your device can then "negotiate" with - just trial and error.
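The "do your best" pick can be written down as a rough scoring heuristic. This is just a sketch of the reasoning above - the function names, the overlap cutoff, and the 1.5x penalty for non-negotiable adjacent-channel noise are my own illustrative choices, not anything out of a standard or a real driver:

```python
def pick_channel(scan_results, preferred=(1, 6, 11)):
    """scan_results: list of (channel, rssi_dbm) tuples for nearby networks.
    Scores each preferred channel by total interference power: co-channel
    networks count at full strength (airtime we can at least share via
    CSMA), while networks within 4 channels count as pure noise and get
    weighted heavier, since we can't negotiate with them at all.
    Returns the preferred channel with the lowest score."""
    def mw(dbm):  # convert dBm to milliwatts so powers add sensibly
        return 10 ** (dbm / 10)

    def score(ch):
        total = 0.0
        for other_ch, rssi in scan_results:
            sep = abs(other_ch - ch)
            if sep == 0:
                total += mw(rssi)          # co-channel: shareable airtime
            elif sep < 5:
                total += 1.5 * mw(rssi)    # overlapping: raised noise floor
        return total

    return min(preferred, key=score)
```

For example, a strong network on channel 1 plus weak ones on 6 and 11 steers you to 6, while a single strong network parked on channel 3 bleeds onto both 1 and 6, leaving 11 as the clean pick.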
-Mike