Syntax highlighting and some types of text parsing might be good sample use cases for ego cells.
I was looking at the melee units in the game (micro-battleforge) and thought about the problem of making them aware of each other, so they can do what we humans do when walking closely together: read the intentions of the person next to us. For example, if a cell or unit can see which direction its neighbor is heading, the two can basically agree on the next pixel to move to. This might be overkill for the game, but it could make an interesting prototype. I think I first need to finish the first prototype to get the basics down.
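One way the intention-reading idea could be sketched: each unit publishes the direction it intends to head, and a unit picks its next pixel by checking the pixels its neighbors have already "claimed" with their headings, side-stepping instead of colliding. This is just a minimal illustration under assumed names (`Unit`, `next_pixel` are hypothetical, not from the game's code):

```python
from dataclasses import dataclass

@dataclass
class Unit:
    x: int
    y: int
    heading: tuple  # intended direction as (dx, dy)

def next_pixel(unit, neighbors):
    """Pick the next pixel, yielding if a neighbor intends to occupy it."""
    dx, dy = unit.heading
    target = (unit.x + dx, unit.y + dy)
    # Read neighbors' intentions: the pixels their headings point at.
    claimed = {(n.x + n.heading[0], n.y + n.heading[1]) for n in neighbors}
    if target not in claimed:
        return target
    # Target is claimed: try side-stepping perpendicular to our heading.
    for sx, sy in ((-dy, dx), (dy, -dx)):
        alt = (unit.x + sx, unit.y + sy)
        if alt not in claimed:
            return alt
    return (unit.x, unit.y)  # no free pixel nearby, wait this tick
```

With no neighbors in the way, a unit just moves along its heading; when two units head at the same pixel, one side-steps, which is roughly the "agree on the next pixel" behavior described above.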