
The cursor tracks OK, but the implementation seems to just replace a low-level pointing device. I.e., it's very precise but jittery: all attribution and no salience.

Also, maybe like Siri it should be modal. E.g., dwell away to silence it, then dwell on a leading corner to say "Hey, listen..."

Holding the phone seemed to cause problems ("you're not holding it right"). It's probably best with fixed positioning, e.g., attached to a screen (like a Continuity Camera), assuming you're lying down with a fixed head position.

Tracking needs a magnetic (gravity well?) behavior, where the pointer is drawn to UI features (and the user can retract by resisting). Salience weighting could make it quite useful.
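A minimal sketch of what that magnetic behavior might look like, with assumed names, radii, and falloff constants (none of this is a real API): nearby UI targets pull the raw tracked point toward them, pull strength scales with a per-target salience weight, and fast pointer motion (the user resisting) attenuates the pull so the pointer can escape.

```python
# Hypothetical gravity-well pointer filter; all constants are illustrative.
import math
from dataclasses import dataclass

@dataclass
class Target:
    x: float
    y: float
    salience: float  # e.g., derived from accessibility metadata; 0..1

def gravity_adjust(px, py, targets, speed, radius=80.0, max_pull=0.6):
    """Return a pointer position drawn toward the most salient nearby target.

    speed: recent pointer speed (px/frame); high speed means the user is
    resisting, so the pull is reduced and the well can be escaped.
    """
    resistance = 1.0 / (1.0 + speed / 20.0)  # assumed falloff with speed
    best, best_w = None, 0.0
    for t in targets:
        d = math.hypot(t.x - px, t.y - py)
        if d > radius:
            continue  # outside this target's well
        w = t.salience * (1.0 - d / radius)  # linear falloff inside radius
        if w > best_w:
            best, best_w = t, w
    if best is None:
        return px, py
    pull = min(max_pull, best_w) * resistance
    return px + (best.x - px) * pull, py + (best.y - py) * pull
```

The key design point is that attraction is a weighted blend rather than a hard snap, so deliberate motion away from a target always wins.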

It's possible that weighting could piggyback on existing accessibility metadata, or it might require a different application programming model.

Similarly, it would be interesting to combine it with voice input that prioritized things near where you are looking.
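One way to sketch that combination, with made-up names and scoring (not any platform's real API): when a spoken label matches several on-screen controls, prefer the match closest to the current gaze point.

```python
# Hypothetical gaze-weighted voice disambiguation; parameters are assumptions.
import math

def pick_target(spoken, gaze, controls, gaze_radius=200.0):
    """controls: list of (label, x, y) tuples. Among controls whose label
    matches the spoken word, return the one scoring highest by proximity
    to the gaze point, or None if nothing matches."""
    best, best_score = None, -1.0
    for label, x, y in controls:
        if label.lower() != spoken.lower():
            continue  # label doesn't match the utterance
        d = math.hypot(x - gaze[0], y - gaze[1])
        score = max(0.0, 1.0 - d / gaze_radius)  # nearer gaze scores higher
        if score > best_score:
            best, best_score = (label, x, y), score
    return best
```

With two identical "OK" buttons on screen, the gaze point resolves the ambiguity that voice alone cannot.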

I'm willing to try, and eager to see how it gets integrated with other features.
