## Sensor Silliness

Today I saw something highly amusing: someone claiming they can take video footage at 1 000 000 000 000 frames per second (that’s one teraframe per second, or 10^12 fps!).

I work in the sensor industry, so I understand that there are a lot of challenges in making sensors that can take images at high frame rates. As an example, the highest frame rate you can buy “off the shelf” is somewhere around 6000 fps. That’s a lot – it’s enough to image many things a person couldn’t see, and it’s about the frequency at which you start to see LEDs switching on and off in non-trivial applications. If you want anything from 30 000 frames per second and up, you have to get a quote. You can buy continuous high-frame-rate cameras in the few tens of thousands of fps; after that, you have to start implementing some interesting tricks to get anything out. A cool one million frames per second can be done, but that’s about the limit of what you can actually buy. Any higher and you need your own electronic engineering design team!

The reason for hitting this limit is quite simple – it’s extremely difficult to control large amounts of electronics at that speed. Any reasonable image sensor has a minimum of three transistors in each pixel, so if you want an HD image at that rate, that’s over 6 million transistors you need to operate simultaneously. This is possible, it’s just extremely difficult – to the point that operating at a few thousand fps is hard, and in the millions is very difficult indeed. Current technology could maybe manage to distribute a signal fast enough to support a nice round half a billion fps (and that’s a stretch) – so we’re still not even within a factor of a thousand of the figure at the beginning of the article:
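The 6-million figure is simple arithmetic – a quick sketch, assuming an HD sensor with the classic three-transistor pixel:

```python
# Rough transistor count for a 3T-per-pixel HD image sensor.
width, height = 1920, 1080       # HD resolution
transistors_per_pixel = 3        # classic 3T active-pixel design

pixels = width * height
transistors = pixels * transistors_per_pixel

print(f"{pixels:,} pixels -> {transistors:,} transistors")
# -> 2,073,600 pixels -> 6,220,800 transistors
```

All of those transistors have to be clocked in lockstep every frame, which is exactly what gets hard at high speed.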

1 000 000 000 000 fps (claimed)

500 000 000 fps (what electronics could do)

Even if you could make some hypothetical hypersensor work at 1 000 000 000 000 frames per second, though, it would still be absurd. Why is that? Well, because of quantum, that’s why! In all seriousness, the major limitation is that you’d struggle to get enough light onto the sensor to make an image. Take this example: a sensor might be 8 cm by 8 cm, and at high resolution you might have a 2 micrometre pixel pitch (that’s roughly the width of a pixel). In normal (UK) sunlight, you’d have maybe 1 mW per square centimetre – about what you can easily produce in a lab or a TV studio. From these figures you can roughly calculate the number of photons (that is, the smallest possible units of light) arriving per pixel per frame, working out in turn:

1. The number of pixels
2. The amount of light on the sensor in watts per square metre
3. The energy of a photon
4. Hence the number of photons hitting the sensor per square metre per second
5. Hence the number of photons per pixel per second
6. Hence (at 1 000 000 000 000 frames per second) the number of photons per pixel per frame

I did a calculation like this and got about 0.2 photons per pixel per frame. That’s so ridiculously small that, were you to look at a single frame, you would see no more than about one fifth of the image at a time – and that’s in a mathematically perfect imager (which can never exist).
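For anyone who wants to check the steps above, here’s a minimal sketch in Python. The wavelength (550 nm, mid-green) is my assumption – the article doesn’t state one – and with these particular inputs the answer comes out even smaller than 0.2; the exact figure moves around with the assumed spectrum and irradiance, but it stays far below one photon per pixel per frame either way:

```python
# Photons per pixel per frame for the hypothetical tera-fps sensor.
# Assumptions beyond the article: 550 nm light, 100% fill factor,
# perfect quantum efficiency.

h = 6.626e-34        # Planck's constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 550e-9  # assumed wavelength, m (not stated in the article)

irradiance = 1e-3 / 1e-4   # 1 mW/cm^2 converted to W/m^2 (= 10 W/m^2)
pixel_pitch = 2e-6         # 2 micrometre pixel pitch
frame_rate = 1e12          # the claimed tera-fps

photon_energy = h * c / wavelength                       # J per photon
photons_per_m2_per_s = irradiance / photon_energy        # step 4
photons_per_pixel_per_s = photons_per_m2_per_s * pixel_pitch**2  # step 5
photons_per_pixel_per_frame = photons_per_pixel_per_s / frame_rate  # step 6

print(f"{photons_per_pixel_per_frame:.2g} photons per pixel per frame")
```

Whatever reasonable numbers you plug in, the conclusion is the same: most pixels see no photons at all in any given frame.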

Even if you had one thousand times as much light (ten times more than full, direct sunlight in the best possible conditions), you would still struggle to get anything better than an image from an old camera phone. If you were lucky.
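One way to see why even a thousand times more light only gets you to old-camera-phone territory is photon shot noise – a sketch using the 0.2-photon figure above (the shot-noise framing is my addition, not the article’s):

```python
# With 1000x more light, each pixel collects ~200 photons per frame.
# Photon arrival is Poisson-distributed, so the best possible
# signal-to-noise ratio is sqrt(N) even for a perfect sensor.
photons = 0.2 * 1000       # photons per pixel per frame
snr = photons ** 0.5       # shot-noise-limited SNR

print(f"{photons:.0f} photons -> shot-noise SNR of about {snr:.0f}:1")
```

An SNR of roughly 14:1 per pixel is visibly noisy – grainy, low-contrast images at best, no matter how good the rest of the camera is.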

So be careful who you believe about very slow-motion videos 🙂