There was a problem on the movie set. The customer paying for the filming of a commercial would ask, "Is this the real color that is going to be on film?" And the video assist technician would promptly turn off the color on all the video monitors. Another problem had to do with the conversion from 24 images per second (cinema) to 30 images per second (television). Since every four film frames must be stretched into five television frames, one television frame in every five is a repeat. This makes it look as if there is a slight "glitch" in otherwise smooth tripod pans ... The problem can be solved, but that takes a considerable amount of fast computation.
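The arithmetic behind the repeated frame can be shown in a short sketch (Python here purely for illustration; real telecine conversion is done in dedicated video hardware and uses field-based 3:2 pulldown rather than whole-frame repeats):

    # Minimal sketch of 24-to-30 frame-rate conversion by whole-frame repetition.
    # Real telecine interleaves fields (3:2 pulldown); this only shows why one
    # television frame in every five is a repeat, which is what causes the judder.
    def pulldown_24_to_30(film_frames):
        video_frames = []
        for i, frame in enumerate(film_frames):
            video_frames.append(frame)
            if i % 4 == 3:                  # after every fourth film frame...
                video_frames.append(frame)  # ...repeat it as the fifth video frame
        return video_frames

    # 24 film frames per second in, 30 video frames per second out.
    assert len(pulldown_24_to_30(list(range(24)))) == 30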
On the home front, I was exploring the requirements of "artificial intelligence." Projects included multiprocessing; memory expansion and multiporting; rapid and asynchronous exchange of large quantities of data; elimination of time-consuming protocols; simplified vector-type graphic image generation and animation; optimum use of graphic memory; and special "intelligent" operating system software. The gist of my thinking is that hardware can serve as an extension of the CPU -- some instructions are executed in software and others in hardware, depending on which is more efficient. Some of my software consists of "microcode" inputs for the recognition engine. I pay a lot of attention to data acquisition and processing in real time. This eliminates the need for input data buffers and paves the way for "seamless" neural processing. By the way, I could never arrive at a satisfactory definition of "intelligence" -- and I noticed that nobody else could, either.
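To make the hardware/software split concrete, here is a rough sketch (in Python, purely for illustration -- the Accelerator class, the operation names, and the dispatch function are invented stand-ins, not the actual design):

    # Hypothetical illustration of running an operation in hardware when a unit
    # supports it, and falling back to a software routine on the CPU otherwise.
    class Accelerator:
        # Stand-in for a hardware unit that implements a few operations directly.
        SUPPORTED = {"vector_add"}

        def run(self, op, a, b):
            # A real device would be driven through registers or DMA, not Python.
            return [x + y for x, y in zip(a, b)]

    def software_vector_add(a, b):
        # Plain CPU fallback for the same operation.
        return [x + y for x, y in zip(a, b)]

    def dispatch(op, a, b, accel=None):
        # Send the operation to hardware when available; otherwise run it in software.
        if accel is not None and op in accel.SUPPORTED:
            return accel.run(op, a, b)
        return software_vector_add(a, b)

    print(dispatch("vector_add", [1, 2], [3, 4], accel=Accelerator()))  # hardware path
    print(dispatch("vector_add", [1, 2], [3, 4]))                       # software path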