Been a while, thoughts on life.

Chris Brooker
November 3, 2009

Cog in the machine

Hey All,

Yes, it’s been a long time since I’ve posted anything. It’s not because I’ve been sitting around the house doing nothing. In fact, I’m busy most nights.

The truth is, I just didn’t feel like it.

I’ve been spending a lot of time thinking about what I want from life. What things I want to accomplish, what goals I want to complete. What adventures I want to have. Not an easy task.

Here are some things I started to help get my mind in a more receptive mode to hear itself.

– Stopped reading the news entirely. (I used to spend hours every day reading hundreds of RSS feeds, which added no real value to my life. Also no Daily Show or Colbert Report.)

– Stopped blogging/producing content with no vision. (Hence the lack of updates and videos.)

– Decided to sell my condo. (When I think about it, the place I’d most like to live right now (in this city) is right downtown. Will be moving as soon as my condo sells.)

– Signed up for Aikido at the Japanese Canadian Cultural Centre (JCCC). (Something I’ve always wanted to do.)

– Stopped checking email (with the exception of work email at 11am and 4pm). Stopped receiving email on my phone entirely.

– Bought a notebook (paper). If anyone knows me, they know I rail against the use of paper, but getting back to basics has helped me focus. There are no unproductive distractions when it’s just you and paper.

– Try to watch only one hour of TV a day. (My typical method of procrastination; with Hulu and a Media Centre PC there is always something to watch.)

– Every day, write down a list of accomplishments and a list of things to accomplish the next day. (I do this for work too.)

– Attend a Zen Buddhist retreat. (I’m not a religious person, but I feel it would be a great new experience to spend a few days in silence in a foreign environment; perhaps the change of perspective will be illuminating. If nothing else, it will be a chance to try something new.)

That’s about it for now. Let me know what you do to keep life interesting and moving forward. Do you have a grand vision? Are you working towards it?

Optic nerve signal interception, interpretation, manipulation and reintroduction for human/computer interface for use in augmented reality

Chris Brooker
April 26, 2009

I have been doing a lot of thinking lately about human/computer interfaces with respect to augmented reality. There seems to be, at least to me, a lack of serious research in the field of true human/computer interfaces.

Sure, we’ve been able to control simple systems by training ourselves to modify our brainwaves or flexing muscles to control an artificial limb. But this is more us adapting to the interface than us truly being one with the machine.

I think the main problem has been the lack of very detailed research on how our nervous system actually works — more specifically for this post, the optic nerve. There is book after book describing the anatomy of the retina, optic nerve, optic chiasm, optic tracts, etc. But nowhere have I been able to find a book or paper on precisely how the optic nerve transfers the signal from the retina to the visual cortex.

It’s estimated that there are 1.2 million nerve fibers that make up the optic nerve.

  • What are these fibers?
  • Does each fiber behave like a wire transferring electrical impulses?
  • Do they work together as one large wire?
  • Do they work in groups of wires transmitting different information?
  • Are the fibers redundant groups transferring the same data in case one is damaged?
  • Do we have to tap each fiber?
  • Is the data transmission bidirectional?
  • Does the eye receive any information from the optic nerve?
  • What is the nature of the signal?
  • Is it straight bit data?
  • Is it a complex modulated signal? Etc.

I can’t seem to find answers to any of these questions. Perhaps I’m not looking hard enough, or perhaps there really hasn’t been any research into it. If I had the resources required to answer these questions, I would very much like to work toward the following goal or thesis:


Is it possible to splice into the optic nerve and tap the signal, for lack of a better term (interception); feed it into a computer and decode those impulses into raw retinal image data that can be displayed on a screen (interpretation); alter that raw retinal image data, e.g. by superimposing text (manipulation); and inject the altered signal back into the optic nerve for traditional processing by the visual cortex (reintroduction)?
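To make the four stages concrete, here is a toy sketch of that pipeline in Python. To be clear, everything here is a stand-in: the function names are hypothetical, and real optic-nerve signals are almost certainly not simple pixel arrays — that unknown encoding is the whole open question above. The sketch only shows how the stages would chain together.

```python
# Conceptual sketch of the pipeline: interception -> interpretation
# -> manipulation -> reintroduction. All data formats are invented
# stand-ins; the true optic-nerve signal encoding is unknown.

def intercept(nerve_signal):
    """Interception: capture the raw signal coming off the optic nerve."""
    return list(nerve_signal)

def interpret(signal, width):
    """Interpretation: decode the raw signal into rows of retinal 'pixels'."""
    return [signal[i:i + width] for i in range(0, len(signal), width)]

def manipulate(image, overlay):
    """Manipulation: superimpose overlay pixels (e.g. rendered text)."""
    for (row, col), value in overlay.items():
        image[row][col] = value
    return image

def reintroduce(image):
    """Reintroduction: re-encode the altered image as a nerve signal."""
    return [pixel for row in image for pixel in row]

# Toy example: a 4x4 "retina", with one overlay pixel written into it.
raw = intercept(range(16))
img = interpret(raw, width=4)
img = manipulate(img, {(0, 0): 255})
out = reintroduce(img)
```

The point of the sketch is that interpretation and reintroduction are inverse codecs around the manipulation step — which is exactly why knowing the nerve's actual encoding matters before any of this is possible.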

If anyone would like to discuss this stuff further drop me a line.



Here is a project on the leading edge of this type of research. However, they interface with receptor cells in the retina via a prosthesis rather than tapping the signal directly, so a functioning or partially functioning eye is still required. Still great science!

Random Additions brought about through discussing this subject

“As to the actual task of intercepting the optic nerve signal, it seems to my blissful ignorance to be a rather straightforward thing. I’m not sure if anyone has actually been able to accomplish it. If you had to tap all 1.2 million optical nerve fibers I could see it being difficult ;).”

“I have a feeling that many of those nerve fibers are redundant, much like the rest of the brain, and we may only need to intercept a relatively small percentage of them to get the desired result, or at least a close approximation of it.

The technology that I think we should really watch out for is nanotechnology. In theory the tiny things could independently search out the nerves and attach themselves to them, requiring simply an injection and not risky surgery. Some form of radio wave or microwave could then be used to communicate with them, i.e. receiving and sending video feed. Frankly this kind of thing could be extrapolated to any part of the brain. For example you could create a “movie” where the viewer feels everything the character feels, including all the senses, emotion, even thoughts.”


There is also the great work that is being done at the Human Connectome Project

More specific to this post, here is a dataset from the University of Utah. The Retinal Connectome Mosaics.


Here is another advancement (April 5, 2010);

Researchers in Australia have developed a “wide-view neurostimulator” to help give sight back to the blind. By implanting electrodes in the eye, they’ll allow those with degenerative vision loss to see a pixelated version of the world around them.
