How can I determine the optimum carrier frequency for a given bandwidth and BER [for an OOK optical communication system]?
There is no optimum carrier frequency.
It would be nice if you could avoid frequencies already in use -- for optical communication,
most of the time that's only the constant light from the sun,
100 Hz / 120 Hz light coming from ancient flickery fluorescent lights, and
30,000 Hz - 40,000 Hz light coming from modern fluorescent lights.
What is the best demodulator circuit for this?
I'm not sure you even want a demodulator.
Perhaps you can more-or-less directly transmit Manchester-coded data,
then recover with a high-pass filter (to remove sunlight) and a comparator.
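To make that concrete, here is a minimal Python sketch of the digital side only; the high-pass filter and comparator are analog and aren't modeled, and the function names and the particular half-bit convention are my own choices for illustration:

```python
# Minimal sketch of "transmit Manchester-coded data directly".
# Function names and the half-bit convention are illustrative only.

def manchester_encode(bits):
    """Each data bit becomes two half-bits: 1 -> (on, off), 0 -> (off, on)."""
    out = []
    for b in bits:
        out.extend([1, 0] if b else [0, 1])
    return out

def manchester_decode(half_bits):
    """Pair up the half-bits and compare them: (on, off) -> 1, (off, on) -> 0.
    In hardware, the comparator output after the high-pass filter plays the
    role of these 0/1 half-bit samples."""
    return [1 if first > second else 0
            for first, second in zip(half_bits[0::2], half_bits[1::2])]

data = [1, 0, 1, 1, 0]
line = manchester_encode(data)           # what the LED actually does
print(line)                              # [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
print(manchester_decode(line) == data)   # True
```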
There are many demodulator circuits that would work fine for this application.
There are several low-cost demodulator ICs available at the big suppliers, some even in easy-to-prototype DIP packages, that work similarly to the part described in
ON Semiconductor Application Note AN531, "MC1496 Balanced Modulator" (an actual implementation of a multiplier):
for example the SA612 (also sold as the NE612, SA602, etc.), the SA614, the MC1496, the TDA9881, MAX2685, BGA2022, TFF1017, SA58640, SA58641, MAX2682, UPC2757, and so on.
You'll have to pick a carrier frequency somewhere in the range your demodulator can handle.
The SA612 seems to work well up to roughly 50 MHz.
What filter should I use?
The rule of thumb is to try to capture most of the energy of each pulse.
Filters should pass (preferably with a flat passband) the stuff you're trying to deliberately transmit,
and should block as much of the interfering stuff as possible.
There's a transition band between the passband and the stopband.
For optical communication, sometimes this transition band is really wide, so you can get away with using simple and low-cost RC filters.
(The lower parts of the radio-frequency communication spectrum have so many people trying to communicate at the same time that the transition bands are forced to be narrow, which forces those users to use more complicated filters.)
With randomly positioned on/off pulses of width L seconds,
the frequency distribution looks like L*sinc(f*L),
where sinc(x) = sin(pi*x) / (pi*x).
(I.e. the Fourier transform of a rectangle function is a sinc function).
The first zero (end of the main lobe) of that sinc frequency distribution is at 1/L Hertz.
Most of the energy of the sinc frequency distribution is in the main lobe -- between 0 Hz and 1/L Hertz.
If you choose to modulate that baseband signal by some carrier frequency -- perhaps something like the way the RC-5 protocol converts each pulse to a burst of square waves -- the main lobe has a bandwidth (between zeros) of 2/L Hz.
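If you want to convince yourself of the main-lobe claim numerically, here is a quick Python check; the 0.5 microsecond pulse width is the 1 Mb/s Manchester example used below, and the roughly-90% figure it prints is a property of the sinc shape itself, not of any particular hardware:

```python
# Numerical check: how much of a rectangular pulse's energy is in the sinc main lobe?
import numpy as np

L = 0.5e-6                                    # pulse width: 0.5 us half-bit at 1 Mb/s
f = np.linspace(-20 / L, 20 / L, 2_000_001)   # frequency axis, +/- 20 main-lobe half-widths
spectrum = L * np.sinc(f * L)                 # np.sinc(x) = sin(pi*x)/(pi*x), as defined above
energy = spectrum ** 2

df = f[1] - f[0]
total = energy.sum() * df
main_lobe = energy[np.abs(f) <= 1 / L].sum() * df

print(f"first zero at {1 / L / 1e6:.0f} MHz")                           # 2 MHz
print(f"fraction of energy in the main lobe: {main_lobe / total:.2f}")  # about 0.9
```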
At 1 Mb/s, L = 0.5 microsecond for Manchester-encoded data (half the full bit-time).
So the bandpass filter in your receiver needs to handle at least 4 MHz of bandwidth between the IR detector and the demodulator.
After the demodulator, the low-pass filter should pass at least 2 MHz (out to the first zero of the main lobe).
To pass a minimum f of 2 MHz with a simple RC low-pass filter,
w = 2*pi*f ≈ 12.6 Mrad/s,
and the RC time constant T = 1/w = RC ≈ 80 ns.
So, for example, if I arbitrarily choose C = 1 nF, the maximum R is 80 Ohms.
The low-pass side of your bandpass filter, being at a much higher frequency,
requires an even smaller time constant, which implies
a smaller R or C or both.
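Here is the same back-of-the-envelope arithmetic collected into one Python snippet; the 1 nF starting value is as arbitrary here as it was above, so treat the resulting R only as a starting point:

```python
# Back-of-the-envelope filter numbers for 1 Mb/s Manchester-coded OOK.
import math

bit_rate     = 1e6                 # 1 Mb/s
L            = 0.5 / bit_rate      # 0.5 us Manchester half-bit
f_baseband   = 1 / L               # 2 MHz: first zero of the baseband main lobe
bw_modulated = 2 / L               # 4 MHz: main-lobe width once modulated onto a carrier

w  = 2 * math.pi * f_baseband      # about 12.6 Mrad/s
rc = 1 / w                         # about 80 ns time constant for the post-demodulator low-pass
C  = 1e-9                          # arbitrarily pick C = 1 nF ...
R  = rc / C                        # ... which caps R at about 80 ohms

print(f"pass {f_baseband / 1e6:.0f} MHz at baseband, {bw_modulated / 1e6:.0f} MHz around the carrier")
print(f"RC = {rc * 1e9:.0f} ns -> with C = 1 nF, R <= {R:.0f} ohm")
```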
Further comments
The Shannon–Hartley theorem and the Fourier transform revolutionized telecommunications.
Most EE programs run students through several semesters of classes to clear up common misunderstandings and to work through the full ramifications of these simple-seeming ideas.
You might consider skimming through the relevant sections of a digital signal processing book.
Have I mentioned the RC-5 protocol?
You could do worse than to try to speed up each part of that roughly 562 bit/second protocol by a factor of roughly 1000 or 2000, to get roughly 500,000 bit/s or 1 Mbit/s.
Each bit of data is Manchester coded into two half-bits,
with 32 pulses in one half-bit and the LED turned off for the other half-bit.
(That seems to require a 64 MHz clock to get a full 1 Mbit/second,
and lots of hardware can't go that fast --
perhaps fewer pulses per half-bit, more like Ronja,
or fewer data bits per second,
would make it easier to use standard off-the-shelf hardware).
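Here is that scaling arithmetic spelled out in Python; the baseline figures (1.778 ms per bit, 32 carrier cycles per half-bit) are the standard RC-5 numbers, and the two scale factors are just the "roughly 1000x" and "full 1 Mbit/s" cases discussed above:

```python
# What "RC-5, but ~1000x or ~1800x faster" works out to.
rc5_bit_time        = 1.778e-3   # seconds per data bit in standard RC-5
cycles_per_half_bit = 32         # carrier cycles while the LED is bursting

for scale in (1000, 1778):       # roughly 562 kbit/s, and a full 1 Mbit/s
    bit_rate   = scale / rc5_bit_time
    half_bit   = rc5_bit_time / (2 * scale)
    pulse_rate = cycles_per_half_bit / half_bit   # LED on/off rate during a burst
    print(f"x{scale}: {bit_rate / 1e3:.0f} kbit/s, "
          f"half-bit {half_bit * 1e9:.0f} ns, "
          f"burst rate {pulse_rate / 1e6:.0f} MHz")
```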
You might also consider looking at the RONJA project, especially the schematics. They use several clever tricks to get their data rate up to 10 Mbit/s using low-cost parts.
The incoming 10BASE-T Ethernet signal has already Manchester coded each data bit into two half-bits,
and Ronja transmits one or two pulses per data bit.
(In effect, Ronja turns the LED on for one clock pulse per data bit, and the receiver watches to see whether a second pulse arrives to decide whether that data bit is a 1 or a 0.)
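As a toy model only (it follows the description in the paragraph above, not Ronja's actual line code or schematics), the pulse-counting idea looks something like this:

```python
# Toy model of "one pulse per bit, plus a second pulse for one of the bit values".
# Purely an illustration of the idea above, not Ronja's real encoding.

def encode(bits, slots_per_bit=4):
    """LED on for slot 0 of every bit period; also on for slot 2 when the bit is 1."""
    frame = []
    for b in bits:
        slots = [0] * slots_per_bit
        slots[0] = 1                 # the always-present pulse
        if b:
            slots[2] = 1             # the extra mid-bit pulse marks a 1
        frame.extend(slots)
    return frame

def decode(frame, slots_per_bit=4):
    """Count pulses per bit period: two pulses -> 1, one pulse -> 0."""
    return [1 if sum(frame[i:i + slots_per_bit]) > 1 else 0
            for i in range(0, len(frame), slots_per_bit)]

data = [1, 0, 0, 1, 1]
print(decode(encode(data)) == data)   # True
```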
Sounds like a fun and educational project. Good luck!