Technical FAQs

What is the basis for the Optalysys technology?

Using diffraction and Fourier optics, coupled with our novel designs, we are able to combine matrix multiplication and optical Fourier transforms into more complex mathematical processes, such as derivative operations. In place of lenses, we also use liquid crystal patterns to focus the light as it travels through the system. This means the tight alignment tolerances required throughout the system are achieved through dynamic addressing in software.
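
To make the idea concrete, the sketch below shows the underlying mathematics only (illustrative NumPy code, not Optalysys code or hardware): a Fourier transform followed by a pointwise multiplication, the kind of step an optical Fourier stage combined with a filter pattern can perform, yields the derivative of the input.

```python
import numpy as np

# Minimal sketch of the mathematics, not the optical hardware: a derivative
# computed as a Fourier transform, a pointwise multiplication by i*k, and an
# inverse transform.
n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)                                   # field sampled on the grid

k = 2 * np.pi * np.fft.fftfreq(n, d=x[1] - x[0])    # angular spatial frequencies
dfdx = np.fft.ifft(1j * k * np.fft.fft(f)).real     # multiply by i*k in Fourier space

assert np.allclose(dfdx, 3 * np.cos(3 * x))         # d/dx sin(3x) = 3 cos(3x)
```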

All modern computers have multiple processor cores which run in parallel - how does the Optalysys approach differ?

For parallel functions, such as the Fourier transform, each number in the output is the result of a calculation involving every number in the input. Dividing such operations between multiple processor cores creates complex data-management issues, as each core must communicate with the others and data must be buffered into local memory. This leads to challenging coding problems and only incremental improvements as the resolution is increased.
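
As a concrete illustration of that all-to-all dependence, the short sketch below (illustrative NumPy code only) computes a single output bin of a discrete Fourier transform directly from its definition; every input sample appears in the sum.

```python
import numpy as np

# Illustrative only: one output bin of a discrete Fourier transform, written
# out directly. Every input sample x[j] contributes to this single output.
n = 8
x = np.random.rand(n)

k = 2                                                      # one output frequency bin
manual = sum(x[j] * np.exp(-2j * np.pi * k * j / n) for j in range(n))

assert np.allclose(manual, np.fft.fft(x)[k])               # matches NumPy's FFT
```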

The Optalysys technology operates truly in parallel, using the natural properties of light and diffraction. Numerical data is entered into the liquid crystal grids (known as SLMs, or Spatial Light Modulators) and is encoded into the laser beam as it passes through. The data is then processed together as the beam is focussed or passes through the next optical stage. Increasing the resolution of the data is achieved by adding more pixels to the SLM, but the process time, once the data is addressed, remains the same regardless of the amount of data being entered. The Optalysys approach therefore provides a truly scalable method of performing the large calculations used in fluid dynamics modelling (CFD) and correlation pattern recognition.

How is the Optalysys technology benchmarked against electronic methods?

The simplest benchmark to use is FLOPS (floating point operations per second), the measure used for electronic processors, although this is not an ideal comparison as it does not take into account the supporting infrastructure requirements or the specific process being judged. Roughly speaking, a two-dimensional FFT (Fast Fourier Transform) takes n^2 log(n) operations, where n is the number of data points along each dimension of the grid. Based on this, our first demonstrator system, which will operate at a frame rate of 20 Hz and a resolution of 500×500 pixels and produce a vorticity plot, will run at around 40 GFLOPS. However, this will be scaled in terms of frame rate, resolution and functionality to produce solver systems operating at well over the petaFLOPS rates quoted for supercomputers, leading to exaFLOPS calculations and beyond.
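
For readers who want to reproduce the form of such an estimate, the sketch below uses the n^2 log(n) operation count quoted above. The number of FFT-equivalent passes a full vorticity calculation needs per frame is a hypothetical parameter here, not an official Optalysys figure, and the printed number is purely illustrative.

```python
import math

# Illustrative estimator, not an official figure: the equivalent FLOPS of an
# optical stage, counted as the electronic cost of the same FFT work.
def effective_flops(n, frame_rate_hz, fft_passes_per_frame):
    ops_per_fft = n**2 * math.log2(n)          # ~n^2 log(n) operations per 2-D FFT
    return ops_per_fft * fft_passes_per_frame * frame_rate_hz

# Hypothetical values: 500x500 grid, 20 Hz frame rate, assumed passes per frame.
print(f"{effective_flops(500, 20, fft_passes_per_frame=100):.2e} FLOPS")
```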

Optalysys technology is quick at CFD calculations, so what?

Optical techniques can make all stages of the CFD process more effective. We are still far from a purely optical CFD solver, but the technology can already help us generate data, and learn from it, more effectively. We are always thinking about the full CFD process and how to learn from it; the large datasets this produces are another exciting opportunity.

Will you be developing an optical CFD solver?

That would be the ultimate project to get involved with, and we would love to partner with others to do it. Our initial focus is to prove that optical technologies can improve and complement what is already being done digitally.

Do the optical numerics offer anything different mathematically?

Absolutely. Optical technology represents point data using light and gives us instantaneous access to spectral, non-local operations. This is potentially very important because it could make the linear algebra currently performed on digital computers more efficient and more accurate. For example, we can perform matrix multiplications or convolution operations at the speed of light.
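
As a digital stand-in for what a single optical pass can do, the sketch below (illustrative NumPy code, not the optical system) uses the convolution theorem: a full 2-D circular convolution reduces to one pointwise multiplication in Fourier space.

```python
import numpy as np

# Illustrative digital equivalent, not the optical system: a 2-D circular
# convolution performed as a single pointwise product in Fourier space.
n = 64
a = np.random.rand(n, n)
b = np.random.rand(n, n)

conv = np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)).real

# Check one output sample against the direct definition of circular convolution.
i, j = 5, 9
direct = sum(a[p, q] * b[(i - p) % n, (j - q) % n]
             for p in range(n) for q in range(n))
assert np.allclose(conv[i, j], direct)
```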