Phones like Samsung's Galaxy Note 10 and (probably) the upcoming Galaxy S11 are big, beautiful and powerful, but they haven't replaced true productivity machines like laptops. Part of the reason is that phone keyboards are small and cramped. Samsung may have a way to change that: using your phone's front camera to create an invisible keyboard.
The company, through its C-Lab initiative, which gives employees a year to work on research projects that could one day end up as products, showed off SelfieType at CES, the world's biggest tech convention. SelfieType works by combining artificial intelligence with a phone's front camera to map your finger joints and translate finger movements into keystrokes.
Virtual keyboards have appeared before, using laser projection, and they've never really caught on, given that they're too unreliable to work at speed. SelfieType is different in that it works by analyzing how your finger joints move, not by focusing on where you physically press down with your finger.
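To make the idea concrete, here's a toy sketch of that joint-movement approach. This is purely illustrative and not Samsung's actual model: the finger-to-key mapping and the threshold-based motion detector are invented stand-ins for whatever learned vision model SelfieType uses.

```python
# Toy illustration of keystroke-from-joint-movement classification.
# Assumption: some upstream vision model has already produced, per video
# frame, a downward displacement value for each finger's joints.

# Hypothetical home-row mapping for the left hand (not from Samsung).
FINGER_TO_KEY = {
    "index": "f",
    "middle": "d",
    "ring": "s",
    "pinky": "a",
}

def detect_keystroke(displacements, threshold=0.5):
    """Pick the finger with the largest downward joint movement.

    displacements: dict mapping finger name -> downward movement
    (arbitrary units). Returns the mapped key, or None when no finger
    moved enough to count as a press.
    """
    finger = max(displacements, key=displacements.get)
    if displacements[finger] < threshold:
        return None
    return FINGER_TO_KEY.get(finger)

# One "frame" where the index finger clearly taps down:
print(detect_keystroke({"index": 0.9, "middle": 0.1, "ring": 0.0, "pinky": 0.05}))
```

The point of the sketch is the contrast with laser-projection keyboards: nothing here depends on where a fingertip lands on a surface, only on how the joints move, which is why per-user calibration data (typing style, hand size) matters so much for the real system.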
I was able to briefly demo SelfieType on the crammed CES show floor, and I was impressed. My demo was essentially a SelfieType tutorial, with the app guiding me through how different keys are mapped to different fingers, as curious show-goers huddled around to watch. The performance anxiety was real. It's definitely early days -- this is a research project and not a preview of a developed, market-ready application -- but the foundation was there.
Won Chun, one of the C-Lab researchers behind SelfieType, said the main challenge is getting more data. The algorithm that interprets joint movements will be more effective and versatile if the team can gather enough data on different typing styles and hand sizes.
I, for instance, have bigger-than-average hands, and as a result type differently than most of the people from whom the team has so far gathered data. I also don't use my left pinkie to type, something SelfieType at present can't work with.
This could all be smoothed out, in theory. But Chun insists SelfieType is just a research project for now, and there's no plan for it to be in any upcoming devices. "For now" being the key phrase.
See more at: CNET